Mar 16 00:06:45 crc systemd[1]: Starting Kubernetes Kubelet...
Mar 16 00:06:45 crc restorecon[4695]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 16 00:06:45 crc restorecon[4695]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 16 00:06:45 crc restorecon[4695]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 16 00:06:45 crc restorecon[4695]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc 
restorecon[4695]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 16 00:06:46 crc restorecon[4695]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 16 00:06:46 crc restorecon[4695]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 16 00:06:46 crc restorecon[4695]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 16 00:06:46 crc 
restorecon[4695]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 16 
00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin
to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 
crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 16 00:06:46 crc restorecon[4695]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 16 00:06:46 crc restorecon[4695]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 16 00:06:46 crc restorecon[4695]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 16 00:06:46 crc restorecon[4695]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 16 
00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 16 00:06:46 crc restorecon[4695]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 16 00:06:46 crc restorecon[4695]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 16 00:06:46 crc 
restorecon[4695]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc 
restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc 
restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 16 00:06:46 crc restorecon[4695]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 16 00:06:46 crc restorecon[4695]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 16 00:06:46 crc restorecon[4695]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 16 00:06:46 crc restorecon[4695]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 16 00:06:46 crc restorecon[4695]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c17 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c23 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 16 00:06:46 crc restorecon[4695]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 
16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]:
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 
crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc 
restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc 
restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:46 crc restorecon[4695]:
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 16 00:06:46 crc restorecon[4695]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 16 00:06:46 crc 
restorecon[4695]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 16 00:06:46 crc restorecon[4695]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 16 00:06:46 crc restorecon[4695]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 16 00:06:46 crc restorecon[4695]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Mar 16 00:06:47 crc kubenswrapper[4816]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 16 00:06:47 crc kubenswrapper[4816]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Mar 16 00:06:47 crc kubenswrapper[4816]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 16 00:06:47 crc kubenswrapper[4816]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Mar 16 00:06:47 crc kubenswrapper[4816]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Mar 16 00:06:47 crc kubenswrapper[4816]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.371018 4816 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.379892 4816 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.379925 4816 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.379935 4816 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.379945 4816 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.379953 4816 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.379962 4816 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.379972 4816 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.379980 4816 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.379988 4816 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.379997 4816 feature_gate.go:330] unrecognized 
feature gate: VSphereControlPlaneMachineSet Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.380006 4816 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.380016 4816 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.380025 4816 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.380033 4816 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.380045 4816 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.380056 4816 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.380065 4816 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.380075 4816 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.380085 4816 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.380093 4816 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.380114 4816 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.380123 4816 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.380134 4816 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.380143 4816 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.380151 4816 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.380159 4816 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.380167 4816 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.380175 4816 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.380183 4816 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.380191 4816 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.380199 4816 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.380206 4816 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.380215 4816 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.380222 4816 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.380230 4816 feature_gate.go:330] unrecognized feature gate: Example Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.380238 4816 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.380246 4816 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 16 00:06:47 crc 
kubenswrapper[4816]: W0316 00:06:47.380254 4816 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.380262 4816 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.380270 4816 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.380280 4816 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.380288 4816 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.380296 4816 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.380303 4816 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.380314 4816 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.380322 4816 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.380331 4816 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.380339 4816 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.380346 4816 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.380354 4816 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.380362 4816 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.380369 4816 
feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.380376 4816 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.380385 4816 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.380392 4816 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.380399 4816 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.380407 4816 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.380417 4816 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.380427 4816 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.380436 4816 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.380445 4816 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.380453 4816 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.380462 4816 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.380470 4816 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.380478 4816 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.380486 4816 feature_gate.go:330] unrecognized feature gate: 
VSphereMultiNetworks Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.380494 4816 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.380501 4816 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.380512 4816 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.380522 4816 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.380533 4816 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.380741 4816 flags.go:64] FLAG: --address="0.0.0.0" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.380760 4816 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.380775 4816 flags.go:64] FLAG: --anonymous-auth="true" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.380787 4816 flags.go:64] FLAG: --application-metrics-count-limit="100" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.380798 4816 flags.go:64] FLAG: --authentication-token-webhook="false" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.380809 4816 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.380822 4816 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.380834 4816 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.380845 4816 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.380855 4816 flags.go:64] FLAG: 
--boot-id-file="/proc/sys/kernel/random/boot_id" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.380865 4816 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.380876 4816 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.380885 4816 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.380894 4816 flags.go:64] FLAG: --cgroup-root="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.380903 4816 flags.go:64] FLAG: --cgroups-per-qos="true" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.380913 4816 flags.go:64] FLAG: --client-ca-file="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.380921 4816 flags.go:64] FLAG: --cloud-config="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.380930 4816 flags.go:64] FLAG: --cloud-provider="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.380939 4816 flags.go:64] FLAG: --cluster-dns="[]" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.380949 4816 flags.go:64] FLAG: --cluster-domain="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.380959 4816 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.380968 4816 flags.go:64] FLAG: --config-dir="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.380977 4816 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.380987 4816 flags.go:64] FLAG: --container-log-max-files="5" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.380998 4816 flags.go:64] FLAG: --container-log-max-size="10Mi" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.381008 4816 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.381018 4816 
flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.381027 4816 flags.go:64] FLAG: --containerd-namespace="k8s.io" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.381036 4816 flags.go:64] FLAG: --contention-profiling="false" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.381046 4816 flags.go:64] FLAG: --cpu-cfs-quota="true" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.381054 4816 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.381064 4816 flags.go:64] FLAG: --cpu-manager-policy="none" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.381073 4816 flags.go:64] FLAG: --cpu-manager-policy-options="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.381094 4816 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.381104 4816 flags.go:64] FLAG: --enable-controller-attach-detach="true" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.381112 4816 flags.go:64] FLAG: --enable-debugging-handlers="true" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.381121 4816 flags.go:64] FLAG: --enable-load-reader="false" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.381131 4816 flags.go:64] FLAG: --enable-server="true" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.381140 4816 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.381152 4816 flags.go:64] FLAG: --event-burst="100" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.381162 4816 flags.go:64] FLAG: --event-qps="50" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.381170 4816 flags.go:64] FLAG: --event-storage-age-limit="default=0" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.381180 4816 flags.go:64] FLAG: --event-storage-event-limit="default=0" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 
00:06:47.381189 4816 flags.go:64] FLAG: --eviction-hard="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.381200 4816 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.381225 4816 flags.go:64] FLAG: --eviction-minimum-reclaim="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.381235 4816 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.381246 4816 flags.go:64] FLAG: --eviction-soft="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.381256 4816 flags.go:64] FLAG: --eviction-soft-grace-period="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.381264 4816 flags.go:64] FLAG: --exit-on-lock-contention="false" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.381274 4816 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.381283 4816 flags.go:64] FLAG: --experimental-mounter-path="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.381293 4816 flags.go:64] FLAG: --fail-cgroupv1="false" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.381302 4816 flags.go:64] FLAG: --fail-swap-on="true" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.381311 4816 flags.go:64] FLAG: --feature-gates="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.381322 4816 flags.go:64] FLAG: --file-check-frequency="20s" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.381331 4816 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.381341 4816 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.381350 4816 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.381359 4816 flags.go:64] FLAG: --healthz-port="10248" Mar 16 00:06:47 crc kubenswrapper[4816]: 
I0316 00:06:47.381369 4816 flags.go:64] FLAG: --help="false" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.381378 4816 flags.go:64] FLAG: --hostname-override="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.381386 4816 flags.go:64] FLAG: --housekeeping-interval="10s" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.381395 4816 flags.go:64] FLAG: --http-check-frequency="20s" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.381405 4816 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.381414 4816 flags.go:64] FLAG: --image-credential-provider-config="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.381422 4816 flags.go:64] FLAG: --image-gc-high-threshold="85" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.381432 4816 flags.go:64] FLAG: --image-gc-low-threshold="80" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.381440 4816 flags.go:64] FLAG: --image-service-endpoint="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.381450 4816 flags.go:64] FLAG: --kernel-memcg-notification="false" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.381459 4816 flags.go:64] FLAG: --kube-api-burst="100" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.381468 4816 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.381477 4816 flags.go:64] FLAG: --kube-api-qps="50" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.381486 4816 flags.go:64] FLAG: --kube-reserved="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.381497 4816 flags.go:64] FLAG: --kube-reserved-cgroup="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.381506 4816 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.381516 4816 flags.go:64] FLAG: --kubelet-cgroups="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 
00:06:47.381524 4816 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.381533 4816 flags.go:64] FLAG: --lock-file="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.381542 4816 flags.go:64] FLAG: --log-cadvisor-usage="false" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.381577 4816 flags.go:64] FLAG: --log-flush-frequency="5s" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.381587 4816 flags.go:64] FLAG: --log-json-info-buffer-size="0" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.381600 4816 flags.go:64] FLAG: --log-json-split-stream="false" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.381610 4816 flags.go:64] FLAG: --log-text-info-buffer-size="0" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.381620 4816 flags.go:64] FLAG: --log-text-split-stream="false" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.381629 4816 flags.go:64] FLAG: --logging-format="text" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.381639 4816 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.381649 4816 flags.go:64] FLAG: --make-iptables-util-chains="true" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.381657 4816 flags.go:64] FLAG: --manifest-url="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.381667 4816 flags.go:64] FLAG: --manifest-url-header="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.381688 4816 flags.go:64] FLAG: --max-housekeeping-interval="15s" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.381698 4816 flags.go:64] FLAG: --max-open-files="1000000" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.381710 4816 flags.go:64] FLAG: --max-pods="110" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.381719 4816 flags.go:64] FLAG: --maximum-dead-containers="-1" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 
00:06:47.381729 4816 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.381738 4816 flags.go:64] FLAG: --memory-manager-policy="None" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.381748 4816 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.381758 4816 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.381767 4816 flags.go:64] FLAG: --node-ip="192.168.126.11" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.381777 4816 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.381797 4816 flags.go:64] FLAG: --node-status-max-images="50" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.381808 4816 flags.go:64] FLAG: --node-status-update-frequency="10s" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.381818 4816 flags.go:64] FLAG: --oom-score-adj="-999" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.381827 4816 flags.go:64] FLAG: --pod-cidr="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.381838 4816 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.381851 4816 flags.go:64] FLAG: --pod-manifest-path="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.381861 4816 flags.go:64] FLAG: --pod-max-pids="-1" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.381872 4816 flags.go:64] FLAG: --pods-per-core="0" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.381883 4816 flags.go:64] FLAG: --port="10250" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.381893 4816 flags.go:64] FLAG: 
--protect-kernel-defaults="false" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.381904 4816 flags.go:64] FLAG: --provider-id="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.381914 4816 flags.go:64] FLAG: --qos-reserved="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.381924 4816 flags.go:64] FLAG: --read-only-port="10255" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.381943 4816 flags.go:64] FLAG: --register-node="true" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.381952 4816 flags.go:64] FLAG: --register-schedulable="true" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.381962 4816 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.381977 4816 flags.go:64] FLAG: --registry-burst="10" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.381987 4816 flags.go:64] FLAG: --registry-qps="5" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.381996 4816 flags.go:64] FLAG: --reserved-cpus="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.382013 4816 flags.go:64] FLAG: --reserved-memory="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.382025 4816 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.382036 4816 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.382046 4816 flags.go:64] FLAG: --rotate-certificates="false" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.382056 4816 flags.go:64] FLAG: --rotate-server-certificates="false" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.382072 4816 flags.go:64] FLAG: --runonce="false" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.382081 4816 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.382091 4816 flags.go:64] FLAG: 
--runtime-request-timeout="2m0s" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.382101 4816 flags.go:64] FLAG: --seccomp-default="false" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.382111 4816 flags.go:64] FLAG: --serialize-image-pulls="true" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.382120 4816 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.382130 4816 flags.go:64] FLAG: --storage-driver-db="cadvisor" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.382140 4816 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.382150 4816 flags.go:64] FLAG: --storage-driver-password="root" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.382160 4816 flags.go:64] FLAG: --storage-driver-secure="false" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.382170 4816 flags.go:64] FLAG: --storage-driver-table="stats" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.382179 4816 flags.go:64] FLAG: --storage-driver-user="root" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.382189 4816 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.382198 4816 flags.go:64] FLAG: --sync-frequency="1m0s" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.382209 4816 flags.go:64] FLAG: --system-cgroups="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.382218 4816 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.382232 4816 flags.go:64] FLAG: --system-reserved-cgroup="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.382242 4816 flags.go:64] FLAG: --tls-cert-file="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.382251 4816 flags.go:64] FLAG: --tls-cipher-suites="[]" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 
00:06:47.382263 4816 flags.go:64] FLAG: --tls-min-version="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.382272 4816 flags.go:64] FLAG: --tls-private-key-file="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.382282 4816 flags.go:64] FLAG: --topology-manager-policy="none" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.382291 4816 flags.go:64] FLAG: --topology-manager-policy-options="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.382300 4816 flags.go:64] FLAG: --topology-manager-scope="container" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.382310 4816 flags.go:64] FLAG: --v="2" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.382322 4816 flags.go:64] FLAG: --version="false" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.382343 4816 flags.go:64] FLAG: --vmodule="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.382353 4816 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.382363 4816 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.382613 4816 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.382626 4816 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.382635 4816 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.382643 4816 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.382651 4816 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.382659 4816 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.382670 4816 
feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.382681 4816 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.382690 4816 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.382699 4816 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.382707 4816 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.382716 4816 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.382724 4816 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.382734 4816 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.382742 4816 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.382750 4816 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.382759 4816 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.382766 4816 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.382774 4816 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.382782 4816 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.382790 4816 
feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.382798 4816 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.382805 4816 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.382813 4816 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.382821 4816 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.382831 4816 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.382841 4816 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.382849 4816 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.382857 4816 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.382870 4816 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.382879 4816 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.382887 4816 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.382896 4816 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.382905 4816 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.382913 4816 feature_gate.go:330] 
unrecognized feature gate: AdditionalRoutingCapabilities Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.382921 4816 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.382929 4816 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.382937 4816 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.382948 4816 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.382957 4816 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.382966 4816 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.382974 4816 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.382983 4816 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.382992 4816 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.383000 4816 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.383011 4816 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.383021 4816 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.383030 4816 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.383038 4816 feature_gate.go:330] unrecognized feature gate: Example Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.383046 4816 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.383054 4816 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.383062 4816 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.383070 4816 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.383078 4816 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.383086 4816 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.383093 4816 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.383101 4816 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.383109 4816 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.383118 4816 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.383128 4816 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.383138 4816 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.383149 4816 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.383158 4816 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.383167 4816 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.383175 4816 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.383183 4816 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.383190 4816 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.383198 4816 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.383207 4816 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.383214 4816 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.383222 4816 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.383247 4816 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.398721 4816 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.398782 4816 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.398942 4816 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.398966 4816 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.398976 4816 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.398987 4816 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.398997 4816 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.399006 4816 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.399014 4816 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.399022 4816 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.399030 4816 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.399038 4816 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.399046 4816 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.399055 4816 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.399063 4816 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.399072 4816 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.399080 4816 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.399089 4816 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.399096 4816 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.399104 4816 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.399112 4816 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.399120 4816 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.399147 4816 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.399156 4816 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.399163 4816 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.399173 4816 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.399183 4816 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.399193 4816 feature_gate.go:330] unrecognized feature gate: Example
Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.399203 4816 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.399212 4816 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.399221 4816 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.399229 4816 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.399237 4816 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.399245 4816 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.399256 4816 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.399269 4816 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.399281 4816 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.399290 4816 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.399298 4816 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.399310 4816 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.399320 4816 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.399328 4816 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.399336 4816 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.399344 4816 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.399352 4816 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.399361 4816 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.399369 4816 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.399378 4816 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.399387 4816 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.399395 4816 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.399404 4816 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.399413 4816 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.399421 4816 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.399429 4816 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.399440 4816 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.399451 4816 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.399461 4816 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.399471 4816 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.399480 4816 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.399488 4816 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.399496 4816 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.399505 4816 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.399515 4816 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.399524 4816 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.399534 4816 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.399542 4816 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.399578 4816 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.399587 4816 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.399595 4816 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.399606 4816 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.399617 4816 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.399627 4816 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.399637 4816 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.399651 4816 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.399908 4816 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.399919 4816 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.399929 4816 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.399938 4816 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.399947 4816 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.399958 4816 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.399968 4816 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.399977 4816 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.399986 4816 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.399994 4816 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.400003 4816 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.400012 4816 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.400021 4816 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.400030 4816 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.400038 4816 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.400047 4816 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.400056 4816 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.400064 4816 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.400071 4816 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.400079 4816 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.400087 4816 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.400095 4816 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.400103 4816 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.400111 4816 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.400119 4816 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.400127 4816 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.400136 4816 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.400145 4816 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.400154 4816 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.400162 4816 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.400171 4816 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.400180 4816 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.400189 4816 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.400197 4816 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.400206 4816 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.400214 4816 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.400223 4816 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.400232 4816 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.400242 4816 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.400251 4816 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.400259 4816 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.400267 4816 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.400275 4816 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.400283 4816 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.400291 4816 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.400299 4816 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.400308 4816 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.400317 4816 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.400325 4816 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.400333 4816 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.400341 4816 feature_gate.go:330] unrecognized feature gate: Example
Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.400349 4816 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.400360 4816 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.400370 4816 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.400379 4816 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.400389 4816 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.400399 4816 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.400409 4816 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.400418 4816 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.400427 4816 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.400436 4816 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.400445 4816 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.400454 4816 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.400463 4816 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.400471 4816 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.400479 4816 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.400487 4816 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.400495 4816 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.400504 4816 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.400512 4816 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.400521 4816 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.400534 4816 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.400953 4816 server.go:940] "Client rotation is on, will bootstrap in background"
Mar 16 00:06:47 crc kubenswrapper[4816]: E0316 00:06:47.405871 4816 bootstrap.go:266] "Unhandled Error" err="part of the existing bootstrap client certificate in /var/lib/kubelet/kubeconfig is expired: 2026-02-24 05:52:08 +0000 UTC" logger="UnhandledError"
Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.410352 4816 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.410492 4816 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.412405 4816 server.go:997] "Starting client certificate rotation"
Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.412448 4816 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled
Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.412625 4816 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.442011 4816 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 16 00:06:47 crc kubenswrapper[4816]: E0316 00:06:47.445789 4816 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.158:6443: connect: connection refused" logger="UnhandledError"
Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.447780 4816 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.471936 4816 log.go:25] "Validated CRI v1 runtime API"
Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.511682 4816 log.go:25] "Validated CRI v1 image API"
Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.514382 4816 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.521713 4816 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-03-16-00-00-47-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3]
Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.521757 4816 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}]
Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.550820 4816 manager.go:217] Machine: {Timestamp:2026-03-16 00:06:47.548429085 +0000 UTC m=+0.644729098 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654124544 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:e97abf98-298f-4589-a70a-4cfb5cb2994a BootID:d8fb348c-4907-4c8f-859a-735976530e03 Filesystems:[{Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365408768 Type:vfs Inodes:821633 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108169 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827060224 Type:vfs Inodes:4108169 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:9f:f8:54 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:9f:f8:54 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:00:56:52 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:46:41:e3 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:d1:37:59 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:5d:a7:05 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:ce:bb:82:53:ac:00 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:8e:84:84:1c:99:9a Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654124544 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.551607 4816 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.551863 4816 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.554356 4816 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.554691 4816 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.554749 4816 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.555082 4816 topology_manager.go:138] "Creating topology manager with none policy"
Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.555103 4816 container_manager_linux.go:303] "Creating device plugin manager"
Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.555813 4816 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.555865 4816 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.556361 4816 state_mem.go:36] "Initialized new in-memory state store"
Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.556910 4816 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.562359 4816 kubelet.go:418] "Attempting to sync node with API server"
Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.562394 4816 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.562420 4816 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.562442 4816 kubelet.go:324] "Adding apiserver pod source"
Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.562461 4816 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.567138 4816 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.568614 4816 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.571029 4816 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.571218 4816 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.158:6443: connect: connection refused
Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.571302 4816 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.158:6443: connect: connection refused
Mar 16 00:06:47 crc kubenswrapper[4816]: E0316 00:06:47.571371 4816 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.158:6443: connect: connection refused" logger="UnhandledError"
Mar 16 00:06:47 crc kubenswrapper[4816]: E0316 00:06:47.571410 4816 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.158:6443: connect: connection refused" logger="UnhandledError"
Mar 16 00:06:47
crc kubenswrapper[4816]: I0316 00:06:47.572926 4816 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.572975 4816 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.572991 4816 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.573006 4816 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.573028 4816 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.573043 4816 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.573057 4816 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.573083 4816 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.573101 4816 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.573121 4816 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.573145 4816 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.573161 4816 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.573208 4816 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.574068 4816 server.go:1280] "Started kubelet" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 
00:06:47.574219 4816 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.574453 4816 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.575686 4816 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 16 00:06:47 crc systemd[1]: Started Kubernetes Kubelet. Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.577774 4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.158:6443: connect: connection refused Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.585329 4816 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.587874 4816 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.590165 4816 volume_manager.go:287] "The desired_state_of_world populator starts" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.590187 4816 volume_manager.go:289] "Starting Kubelet Volume Manager" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.590274 4816 server.go:460] "Adding debug handlers to kubelet server" Mar 16 00:06:47 crc kubenswrapper[4816]: E0316 00:06:47.590368 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:06:47 crc kubenswrapper[4816]: E0316 00:06:47.588868 4816 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.158:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.189d29a2d41dfe7d default 0 0001-01-01 00:00:00 +0000 
UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:47.574019709 +0000 UTC m=+0.670319702,LastTimestamp:2026-03-16 00:06:47.574019709 +0000 UTC m=+0.670319702,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.592093 4816 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.592449 4816 factory.go:55] Registering systemd factory Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.592481 4816 factory.go:221] Registration of the systemd container factory successfully Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.595413 4816 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.158:6443: connect: connection refused Mar 16 00:06:47 crc kubenswrapper[4816]: E0316 00:06:47.595521 4816 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.158:6443: connect: connection refused" logger="UnhandledError" Mar 16 00:06:47 crc kubenswrapper[4816]: E0316 00:06:47.595966 4816 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.158:6443: connect: connection refused" interval="200ms" Mar 16 00:06:47 crc 
kubenswrapper[4816]: I0316 00:06:47.600105 4816 factory.go:153] Registering CRI-O factory Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.600159 4816 factory.go:221] Registration of the crio container factory successfully Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.600281 4816 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.600320 4816 factory.go:103] Registering Raw factory Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.600349 4816 manager.go:1196] Started watching for new ooms in manager Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.602329 4816 manager.go:319] Starting recovery of all containers Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.608720 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.608954 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.611157 4816 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Mar 16 00:06:47 crc 
kubenswrapper[4816]: I0316 00:06:47.611357 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.611496 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.611672 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.611794 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.611910 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.612059 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.612182 4816 reconstruct.go:130] "Volume is 
marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.612301 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.612415 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.612528 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.612684 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.612858 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.612979 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.613089 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.613202 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.613318 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.613442 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.613583 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.613731 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" 
volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.613847 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.613955 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.614083 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.614195 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.614303 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.614482 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" 
volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.614691 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.614883 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.615024 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.615172 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.615337 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.615457 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" 
seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.615664 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.615800 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.615930 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.616046 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.616157 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.616268 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: 
I0316 00:06:47.616388 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.616504 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.616652 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.616914 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.617039 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.617164 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.617288 4816 reconstruct.go:130] "Volume is marked as 
uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.617417 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.617531 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.617685 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.617802 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.617931 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.618045 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.618166 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.618288 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.618401 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.618610 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.618737 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.618850 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" 
volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.618963 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.619073 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.619198 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.619328 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.619451 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.619884 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" 
volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.620015 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.620127 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.620261 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.621287 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.621352 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.621381 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" 
volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.621409 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.621433 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.621459 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.621483 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.621504 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.621529 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" 
volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.621582 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.621607 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.621630 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.621652 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.621676 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.621699 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" 
seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.621723 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.621746 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.621770 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.621792 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.621818 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.621839 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.621863 4816 reconstruct.go:130] 
"Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.621885 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.621906 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.621926 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.621951 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.621975 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.622007 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.622037 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.622062 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.622085 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.622109 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.622132 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.622154 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" 
volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.622175 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.622196 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.622226 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.622272 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.622371 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.622399 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" 
volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.622430 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.622459 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.622487 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.622513 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.622538 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.622590 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" 
volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.622614 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.622636 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.622656 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.622678 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.622698 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.622717 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.622738 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.622758 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.622778 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.622799 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.622818 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.622842 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" 
volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.622861 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.622882 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.622905 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.622927 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.622947 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.622966 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" 
seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.622985 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.623004 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.623027 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.623054 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.623082 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.623103 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Mar 16 00:06:47 crc 
kubenswrapper[4816]: I0316 00:06:47.623129 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.623153 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.623172 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.623198 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.623220 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.623244 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.623265 4816 reconstruct.go:130] "Volume is 
marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.623286 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.623310 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.623329 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.623349 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.623370 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.623390 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.623411 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.623432 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.623452 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.623471 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.623492 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.623511 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" 
volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.623536 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.623584 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.623605 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.623629 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.623651 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.623670 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.623690 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.623711 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.623731 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.623752 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.623774 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.623794 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" 
volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.623816 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.623841 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.623860 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.623881 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.623904 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.623924 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" 
Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.623946 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.623966 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.623987 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.624009 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.624042 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.624093 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.624121 
4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.624147 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.624171 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.624198 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.624232 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.624269 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.624291 4816 reconstruct.go:130] "Volume is marked as uncertain and added 
into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.624327 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.624349 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.624378 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.624396 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.624414 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.624434 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" 
volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.624456 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.624475 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.624509 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.624529 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.624633 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.624655 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" 
volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.624674 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.624694 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.624720 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.624740 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.624761 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.624810 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" 
volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.624833 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.624854 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.624874 4816 reconstruct.go:97] "Volume reconstruction finished" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.624890 4816 reconciler.go:26] "Reconciler: start to sync state" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.630459 4816 manager.go:324] Recovery completed Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.651953 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.654760 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.654832 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.654855 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.656766 4816 cpu_manager.go:225] "Starting CPU manager" policy="none" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.656828 4816 cpu_manager.go:226] 
"Reconciling" reconcilePeriod="10s" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.656869 4816 state_mem.go:36] "Initialized new in-memory state store" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.663676 4816 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.666331 4816 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.666377 4816 status_manager.go:217] "Starting to sync pod status with apiserver" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.666417 4816 kubelet.go:2335] "Starting kubelet main sync loop" Mar 16 00:06:47 crc kubenswrapper[4816]: E0316 00:06:47.666482 4816 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 16 00:06:47 crc kubenswrapper[4816]: W0316 00:06:47.667372 4816 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.158:6443: connect: connection refused Mar 16 00:06:47 crc kubenswrapper[4816]: E0316 00:06:47.667441 4816 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.158:6443: connect: connection refused" logger="UnhandledError" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.677793 4816 policy_none.go:49] "None policy: Start" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.679172 4816 memory_manager.go:170] "Starting memorymanager" policy="None" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.679222 4816 state_mem.go:35] 
"Initializing new in-memory state store" Mar 16 00:06:47 crc kubenswrapper[4816]: E0316 00:06:47.690505 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.754579 4816 manager.go:334] "Starting Device Plugin manager" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.754667 4816 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.754691 4816 server.go:79] "Starting device plugin registration server" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.755442 4816 eviction_manager.go:189] "Eviction manager: starting control loop" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.755477 4816 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.755639 4816 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.755831 4816 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.755855 4816 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.766728 4816 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc"] Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.766879 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.768446 4816 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.768497 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.768513 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.768719 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:06:47 crc kubenswrapper[4816]: E0316 00:06:47.769082 4816 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.769095 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.769204 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.770247 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.770296 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.770316 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.770588 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.770737 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.770859 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.771445 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.771495 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.771516 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.771887 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.771943 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.771969 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.772184 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.772396 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.772468 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.772831 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.772895 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.772909 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.773475 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.773538 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.773595 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.773898 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.774035 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.774096 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.774215 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.774277 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.774298 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.775007 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.775053 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.775074 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.775212 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.775252 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.775272 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.775450 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.775516 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.776929 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.776985 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.777002 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:06:47 crc kubenswrapper[4816]: E0316 00:06:47.797312 4816 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.158:6443: connect: connection refused" interval="400ms" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.831485 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.831570 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.831594 4816 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.831616 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.831637 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.831712 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.831782 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.831862 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.831906 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.831987 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.832037 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.832071 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.832095 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.832116 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.832144 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.857361 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.859473 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.859525 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.859543 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.859608 4816 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 16 00:06:47 crc kubenswrapper[4816]: E0316 00:06:47.860231 4816 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.158:6443: connect: connection refused" node="crc" Mar 
16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.933105 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.933181 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.933220 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.933259 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.933291 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.933320 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.933347 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.933376 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.933390 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.933459 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.933499 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.933410 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.933474 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.933588 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.933516 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.933595 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.933618 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.933532 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.933665 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.933700 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.933733 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.933389 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.933746 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.933744 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.933765 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.933795 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.933798 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.933816 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.933819 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Mar 16 00:06:47 crc kubenswrapper[4816]: I0316 00:06:47.933840 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 16 00:06:48 crc kubenswrapper[4816]: I0316 00:06:48.060693 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 16 00:06:48 crc kubenswrapper[4816]: I0316 00:06:48.062790 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:06:48 crc kubenswrapper[4816]: I0316 00:06:48.062858 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:06:48 crc kubenswrapper[4816]: I0316 00:06:48.062883 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:06:48 crc kubenswrapper[4816]: I0316 00:06:48.062920 4816 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 16 00:06:48 crc kubenswrapper[4816]: E0316 00:06:48.063657 4816 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.158:6443: connect: connection refused" node="crc"
Mar 16 00:06:48 crc kubenswrapper[4816]: I0316 00:06:48.100027 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 16 00:06:48 crc kubenswrapper[4816]: I0316 00:06:48.118056 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 16 00:06:48 crc kubenswrapper[4816]: I0316 00:06:48.136072 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Mar 16 00:06:48 crc kubenswrapper[4816]: I0316 00:06:48.157856 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc"
Mar 16 00:06:48 crc kubenswrapper[4816]: W0316 00:06:48.161492 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-7a795f904b7275990b6d69977aa116f32c3ac88415844e2c54ed885536e7a489 WatchSource:0}: Error finding container 7a795f904b7275990b6d69977aa116f32c3ac88415844e2c54ed885536e7a489: Status 404 returned error can't find the container with id 7a795f904b7275990b6d69977aa116f32c3ac88415844e2c54ed885536e7a489
Mar 16 00:06:48 crc kubenswrapper[4816]: W0316 00:06:48.163849 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-ce5ce1a0aeb4cd08bb80224cf0434fcf51be56e46e93b4b6faefdfe229ae8322 WatchSource:0}: Error finding container ce5ce1a0aeb4cd08bb80224cf0434fcf51be56e46e93b4b6faefdfe229ae8322: Status 404 returned error can't find the container with id ce5ce1a0aeb4cd08bb80224cf0434fcf51be56e46e93b4b6faefdfe229ae8322
Mar 16 00:06:48 crc kubenswrapper[4816]: I0316 00:06:48.164101 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 16 00:06:48 crc kubenswrapper[4816]: W0316 00:06:48.166816 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-482d8e39eee76a31439e56f2dedbb02d839f4b98a5e34d1f81efc30484d830ad WatchSource:0}: Error finding container 482d8e39eee76a31439e56f2dedbb02d839f4b98a5e34d1f81efc30484d830ad: Status 404 returned error can't find the container with id 482d8e39eee76a31439e56f2dedbb02d839f4b98a5e34d1f81efc30484d830ad
Mar 16 00:06:48 crc kubenswrapper[4816]: W0316 00:06:48.179135 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-c9472f6930ca03eb7f0d6ed9ed055f1c5417c4c18dc7a40e0bf7e696081b3759 WatchSource:0}: Error finding container c9472f6930ca03eb7f0d6ed9ed055f1c5417c4c18dc7a40e0bf7e696081b3759: Status 404 returned error can't find the container with id c9472f6930ca03eb7f0d6ed9ed055f1c5417c4c18dc7a40e0bf7e696081b3759
Mar 16 00:06:48 crc kubenswrapper[4816]: W0316 00:06:48.194387 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-f53fc90587252cbf8d63a5fe93b775bc4cb57e00d1a9762e32c76a654561ee2f WatchSource:0}: Error finding container f53fc90587252cbf8d63a5fe93b775bc4cb57e00d1a9762e32c76a654561ee2f: Status 404 returned error can't find the container with id f53fc90587252cbf8d63a5fe93b775bc4cb57e00d1a9762e32c76a654561ee2f
Mar 16 00:06:48 crc kubenswrapper[4816]: E0316 00:06:48.198256 4816 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.158:6443: connect: connection refused" interval="800ms"
Mar 16 00:06:48 crc kubenswrapper[4816]: I0316 00:06:48.464531 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 16 00:06:48 crc kubenswrapper[4816]: I0316 00:06:48.466546 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:06:48 crc kubenswrapper[4816]: I0316 00:06:48.466623 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:06:48 crc kubenswrapper[4816]: I0316 00:06:48.466638 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:06:48 crc kubenswrapper[4816]: I0316 00:06:48.466672 4816 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 16 00:06:48 crc kubenswrapper[4816]: E0316 00:06:48.467047 4816 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.158:6443: connect: connection refused" node="crc"
Mar 16 00:06:48 crc kubenswrapper[4816]: I0316 00:06:48.578707 4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.158:6443: connect: connection refused
Mar 16 00:06:48 crc kubenswrapper[4816]: I0316 00:06:48.673375 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"c9472f6930ca03eb7f0d6ed9ed055f1c5417c4c18dc7a40e0bf7e696081b3759"}
Mar 16 00:06:48 crc kubenswrapper[4816]: I0316 00:06:48.674918 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"482d8e39eee76a31439e56f2dedbb02d839f4b98a5e34d1f81efc30484d830ad"}
Mar 16 00:06:48 crc kubenswrapper[4816]: I0316 00:06:48.676084 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"ce5ce1a0aeb4cd08bb80224cf0434fcf51be56e46e93b4b6faefdfe229ae8322"}
Mar 16 00:06:48 crc kubenswrapper[4816]: I0316 00:06:48.677434 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"7a795f904b7275990b6d69977aa116f32c3ac88415844e2c54ed885536e7a489"}
Mar 16 00:06:48 crc kubenswrapper[4816]: I0316 00:06:48.678661 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"f53fc90587252cbf8d63a5fe93b775bc4cb57e00d1a9762e32c76a654561ee2f"}
Mar 16 00:06:48 crc kubenswrapper[4816]: W0316 00:06:48.700257 4816 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.158:6443: connect: connection refused
Mar 16 00:06:48 crc kubenswrapper[4816]: E0316 00:06:48.700320 4816 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.158:6443: connect: connection refused" logger="UnhandledError"
Mar 16 00:06:48 crc kubenswrapper[4816]: W0316 00:06:48.864072 4816 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.158:6443: connect: connection refused
Mar 16 00:06:48 crc kubenswrapper[4816]: E0316 00:06:48.864211 4816 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.158:6443: connect: connection refused" logger="UnhandledError"
Mar 16 00:06:48 crc kubenswrapper[4816]: W0316 00:06:48.905196 4816 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.158:6443: connect: connection refused
Mar 16 00:06:48 crc kubenswrapper[4816]: E0316 00:06:48.905358 4816 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.158:6443: connect: connection refused" logger="UnhandledError"
Mar 16 00:06:48 crc kubenswrapper[4816]: E0316 00:06:48.999734 4816 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.158:6443: connect: connection refused" interval="1.6s"
Mar 16 00:06:49 crc kubenswrapper[4816]: W0316 00:06:49.182376 4816 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.158:6443: connect: connection refused
Mar 16 00:06:49 crc kubenswrapper[4816]: E0316 00:06:49.182489 4816 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.158:6443: connect: connection refused" logger="UnhandledError"
Mar 16 00:06:49 crc kubenswrapper[4816]: I0316 00:06:49.267366 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 16 00:06:49 crc kubenswrapper[4816]: I0316 00:06:49.269502 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:06:49 crc kubenswrapper[4816]: I0316 00:06:49.269574 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:06:49 crc kubenswrapper[4816]: I0316 00:06:49.269588 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:06:49 crc kubenswrapper[4816]: I0316 00:06:49.269656 4816 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 16 00:06:49 crc kubenswrapper[4816]: E0316 00:06:49.270256 4816 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.158:6443: connect: connection refused" node="crc"
Mar 16 00:06:49 crc kubenswrapper[4816]: I0316 00:06:49.508622 4816 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Mar 16 00:06:49 crc kubenswrapper[4816]: E0316 00:06:49.509790 4816 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.158:6443: connect: connection refused" logger="UnhandledError"
Mar 16 00:06:49 crc kubenswrapper[4816]: I0316 00:06:49.578973 4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.158:6443: connect: connection refused
Mar 16 00:06:49 crc kubenswrapper[4816]: I0316 00:06:49.685659 4816 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1" exitCode=0
Mar 16 00:06:49 crc kubenswrapper[4816]: I0316 00:06:49.685737 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1"}
Mar 16 00:06:49 crc kubenswrapper[4816]: I0316 00:06:49.686006 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 16 00:06:49 crc kubenswrapper[4816]: I0316 00:06:49.687570 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:06:49 crc kubenswrapper[4816]: I0316 00:06:49.687618 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:06:49 crc kubenswrapper[4816]: I0316 00:06:49.687635 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:06:49 crc kubenswrapper[4816]: I0316 00:06:49.688008 4816 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174" exitCode=0
Mar 16 00:06:49 crc kubenswrapper[4816]: I0316 00:06:49.688086 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174"}
Mar 16 00:06:49 crc kubenswrapper[4816]: I0316 00:06:49.688219 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 16 00:06:49 crc kubenswrapper[4816]: I0316 00:06:49.689608 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:06:49 crc kubenswrapper[4816]: I0316 00:06:49.689677 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:06:49 crc kubenswrapper[4816]: I0316 00:06:49.689710 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:06:49 crc kubenswrapper[4816]: I0316 00:06:49.690095 4816 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="59fcd3602ca5395b4f9b934768cad48d5504480af46a1fff8203eb46415b8769" exitCode=0
Mar 16 00:06:49 crc kubenswrapper[4816]: I0316 00:06:49.690132 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"59fcd3602ca5395b4f9b934768cad48d5504480af46a1fff8203eb46415b8769"}
Mar 16 00:06:49 crc kubenswrapper[4816]: I0316 00:06:49.690183 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 16 00:06:49 crc kubenswrapper[4816]: I0316 00:06:49.691071 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 16 00:06:49 crc kubenswrapper[4816]: I0316 00:06:49.691783 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:06:49 crc kubenswrapper[4816]: I0316 00:06:49.691846 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:06:49 crc kubenswrapper[4816]: I0316 00:06:49.691871 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:06:49 crc kubenswrapper[4816]: I0316 00:06:49.693429 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:06:49 crc kubenswrapper[4816]: I0316 00:06:49.693469 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:06:49 crc kubenswrapper[4816]: I0316 00:06:49.693483 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:06:49 crc kubenswrapper[4816]: I0316 00:06:49.693764 4816 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="e44b0bbe4cebe4e52726f3f654f8b4a2953abcd0ad32db752c4641b3f0be1b00" exitCode=0
Mar 16 00:06:49 crc kubenswrapper[4816]: I0316 00:06:49.693901 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"e44b0bbe4cebe4e52726f3f654f8b4a2953abcd0ad32db752c4641b3f0be1b00"}
Mar 16 00:06:49 crc kubenswrapper[4816]: I0316 00:06:49.693919 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 16 00:06:49 crc kubenswrapper[4816]: I0316 00:06:49.696134 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:06:49 crc kubenswrapper[4816]: I0316 00:06:49.696165 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:06:49 crc kubenswrapper[4816]: I0316 00:06:49.696178 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:06:49 crc kubenswrapper[4816]: I0316 00:06:49.702193 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"c60304164aa2f8e740ab25d779ef9b0aa66f6acca476bae45a13f4be44e37e33"}
Mar 16 00:06:49 crc kubenswrapper[4816]: I0316 00:06:49.702229 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"5afc1a568801d941c814fe5e4eb91df609672a9431082d578d600f68f669aa3f"}
Mar 16 00:06:49 crc kubenswrapper[4816]: I0316 00:06:49.702242 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"fea427361067bc1a9d56b7b6699b072b5cdeee8345bf6618b8d81c6848f62098"}
Mar 16 00:06:49 crc kubenswrapper[4816]: I0316 00:06:49.702254 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"5f3dfc8f46079b51e52802920a734bf796a00db2cf42501b9f7e202e0e9bc2ac"}
Mar 16 00:06:49 crc kubenswrapper[4816]: I0316 00:06:49.702702 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 16 00:06:49 crc kubenswrapper[4816]: I0316 00:06:49.703949 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:06:49 crc kubenswrapper[4816]: I0316 00:06:49.704030 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:06:49 crc kubenswrapper[4816]: I0316 00:06:49.704056 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:06:50 crc kubenswrapper[4816]: E0316 00:06:50.002718 4816 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.158:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.189d29a2d41dfe7d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:47.574019709 +0000 UTC m=+0.670319702,LastTimestamp:2026-03-16 00:06:47.574019709 +0000 UTC m=+0.670319702,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 16 00:06:50 crc kubenswrapper[4816]: I0316 00:06:50.579095 4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.158:6443: connect: connection refused
Mar 16 00:06:50 crc kubenswrapper[4816]: E0316 00:06:50.600535 4816 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.158:6443: connect: connection refused" interval="3.2s"
Mar 16 00:06:50 crc kubenswrapper[4816]: I0316 00:06:50.710301 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"c81d0c40be9c3912864280cd98481f837ef29f69d85599b39decf5e9d7a97eff"}
Mar 16 00:06:50 crc kubenswrapper[4816]: I0316 00:06:50.710368 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"da7b257af12b9ee605dc32a26d9565f866a9cafebb4936a0bde7f42ae9909f80"}
Mar 16 00:06:50 crc kubenswrapper[4816]: I0316 00:06:50.710417 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"c2523ea43f2490afb81354be8a2840f54a3be1a8d3154196e0c8ac365d09723a"}
Mar 16 00:06:50 crc kubenswrapper[4816]: I0316 00:06:50.710432 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"46db23088e8ba791d736f70a65bd066af80a5ec27c31332d9ac9c52530b21bfd"}
Mar 16 00:06:50 crc kubenswrapper[4816]: I0316 00:06:50.712581 4816 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69" exitCode=0
Mar 16 00:06:50 crc kubenswrapper[4816]: I0316 00:06:50.712644 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69"}
Mar 16 00:06:50 crc kubenswrapper[4816]: I0316 00:06:50.712814 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 16 00:06:50 crc kubenswrapper[4816]: I0316 00:06:50.713766 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:06:50 crc kubenswrapper[4816]: I0316 00:06:50.713796 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:06:50 crc kubenswrapper[4816]: I0316 00:06:50.713810 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:06:50 crc kubenswrapper[4816]: I0316 00:06:50.722599 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"f2a41f624c9d40a637cf958648b795587652d7139d4b7cea2e8c70d1cdf05dd6"}
Mar 16 00:06:50 crc kubenswrapper[4816]: I0316 00:06:50.722669 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 16 00:06:50 crc kubenswrapper[4816]: I0316 00:06:50.723655 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:06:50 crc kubenswrapper[4816]: I0316 00:06:50.723712 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:06:50 crc kubenswrapper[4816]: I0316 00:06:50.723725 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:06:50 crc kubenswrapper[4816]: I0316 00:06:50.729020 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 16 00:06:50 crc kubenswrapper[4816]: I0316 00:06:50.729371 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 16 00:06:50 crc kubenswrapper[4816]: I0316 00:06:50.729674 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"37dea82cd38ef9cfbda3626b789b21af2ada4b4a58bc2ecc6ce55eda60052277"}
Mar 16 00:06:50 crc kubenswrapper[4816]: I0316 00:06:50.729701 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"02c81b3d94044c1f4f344dd8306cacf8db533a662535ddb0a6a4ffc1bb4cf1f5"}
Mar 16 00:06:50 crc kubenswrapper[4816]: I0316 00:06:50.729713 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"94c8e0e7b0ecd4d0f525f55538bff0b01653544be87c923fc152d4d982da7021"}
Mar 16 00:06:50 crc kubenswrapper[4816]: I0316 00:06:50.730154 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:06:50 crc kubenswrapper[4816]: I0316 00:06:50.730183 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:06:50 crc kubenswrapper[4816]: I0316 00:06:50.730194 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:06:50 crc kubenswrapper[4816]: I0316 00:06:50.730859 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:06:50 crc kubenswrapper[4816]: I0316 00:06:50.730884 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:06:50 crc kubenswrapper[4816]: I0316 00:06:50.730893 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:06:50 crc kubenswrapper[4816]: I0316 00:06:50.870622 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 16 00:06:50 crc kubenswrapper[4816]: I0316 00:06:50.871664 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:06:50 crc kubenswrapper[4816]: I0316 00:06:50.871693 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:06:50 crc kubenswrapper[4816]: I0316 00:06:50.871703 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:06:50 crc kubenswrapper[4816]: I0316 00:06:50.871724 4816 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 16 00:06:50 crc kubenswrapper[4816]: E0316 00:06:50.872311 4816 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.158:6443: connect: connection refused" node="crc"
Mar 16 00:06:50 crc kubenswrapper[4816]: W0316 00:06:50.928090 4816 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.158:6443: connect: connection refused
Mar 16 00:06:50 crc kubenswrapper[4816]: E0316 00:06:50.928182 4816 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.158:6443: connect: connection refused" logger="UnhandledError"
Mar 16 00:06:51 crc kubenswrapper[4816]: W0316 00:06:51.147414 4816 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.158:6443: connect: connection refused
Mar 16 00:06:51 crc kubenswrapper[4816]: E0316 00:06:51.147526 4816 reflector.go:158] "Unhandled Error"
err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.158:6443: connect: connection refused" logger="UnhandledError" Mar 16 00:06:51 crc kubenswrapper[4816]: I0316 00:06:51.733888 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"9285f97ca65307db81fc7bd5712ac11ee1561d8c18d684087969ba4d57ea8651"} Mar 16 00:06:51 crc kubenswrapper[4816]: I0316 00:06:51.734027 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:06:51 crc kubenswrapper[4816]: I0316 00:06:51.734952 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:06:51 crc kubenswrapper[4816]: I0316 00:06:51.734974 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:06:51 crc kubenswrapper[4816]: I0316 00:06:51.734981 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:06:51 crc kubenswrapper[4816]: I0316 00:06:51.737467 4816 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45" exitCode=0 Mar 16 00:06:51 crc kubenswrapper[4816]: I0316 00:06:51.737580 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45"} Mar 16 00:06:51 crc kubenswrapper[4816]: I0316 00:06:51.737596 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller 
attach/detach" Mar 16 00:06:51 crc kubenswrapper[4816]: I0316 00:06:51.737662 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:06:51 crc kubenswrapper[4816]: I0316 00:06:51.737697 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:06:51 crc kubenswrapper[4816]: I0316 00:06:51.737863 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 16 00:06:51 crc kubenswrapper[4816]: I0316 00:06:51.738675 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:06:51 crc kubenswrapper[4816]: I0316 00:06:51.738700 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:06:51 crc kubenswrapper[4816]: I0316 00:06:51.738711 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:06:51 crc kubenswrapper[4816]: I0316 00:06:51.738816 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:06:51 crc kubenswrapper[4816]: I0316 00:06:51.738854 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:06:51 crc kubenswrapper[4816]: I0316 00:06:51.738868 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:06:51 crc kubenswrapper[4816]: I0316 00:06:51.739388 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:06:51 crc kubenswrapper[4816]: I0316 00:06:51.739432 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:06:51 crc kubenswrapper[4816]: I0316 00:06:51.739456 4816 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:06:51 crc kubenswrapper[4816]: I0316 00:06:51.808824 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 16 00:06:52 crc kubenswrapper[4816]: I0316 00:06:52.582987 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 16 00:06:52 crc kubenswrapper[4816]: I0316 00:06:52.583290 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:06:52 crc kubenswrapper[4816]: I0316 00:06:52.585251 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:06:52 crc kubenswrapper[4816]: I0316 00:06:52.585313 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:06:52 crc kubenswrapper[4816]: I0316 00:06:52.585327 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:06:52 crc kubenswrapper[4816]: I0316 00:06:52.592608 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 16 00:06:52 crc kubenswrapper[4816]: I0316 00:06:52.745404 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:06:52 crc kubenswrapper[4816]: I0316 00:06:52.745419 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"15b5a3223ff0fb5b96e5b87ee836add66498165759f4d6ed8cb6413f091967e2"} Mar 16 00:06:52 crc kubenswrapper[4816]: I0316 00:06:52.745496 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller 
attach/detach" Mar 16 00:06:52 crc kubenswrapper[4816]: I0316 00:06:52.745530 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"508e6dbe6f2f1392d16501e09ee2ec523a3c2a245cd96c6ab773e26e83b8bb6a"} Mar 16 00:06:52 crc kubenswrapper[4816]: I0316 00:06:52.745574 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 16 00:06:52 crc kubenswrapper[4816]: I0316 00:06:52.745588 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"fbe90b49c5d42bb9d6d5d1b8f1d02b571403dd3f39d3118081351d72c4910914"} Mar 16 00:06:52 crc kubenswrapper[4816]: I0316 00:06:52.745605 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"c8b061f16a65a27b9a5ef66231376b70fec084479a991b7fdd430957df76da32"} Mar 16 00:06:52 crc kubenswrapper[4816]: I0316 00:06:52.745618 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 16 00:06:52 crc kubenswrapper[4816]: I0316 00:06:52.745451 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:06:52 crc kubenswrapper[4816]: I0316 00:06:52.746706 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:06:52 crc kubenswrapper[4816]: I0316 00:06:52.746752 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:06:52 crc kubenswrapper[4816]: I0316 00:06:52.746767 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:06:52 crc 
kubenswrapper[4816]: I0316 00:06:52.746973 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:06:52 crc kubenswrapper[4816]: I0316 00:06:52.747045 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:06:52 crc kubenswrapper[4816]: I0316 00:06:52.747065 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:06:52 crc kubenswrapper[4816]: I0316 00:06:52.747534 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:06:52 crc kubenswrapper[4816]: I0316 00:06:52.747610 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:06:52 crc kubenswrapper[4816]: I0316 00:06:52.747626 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:06:53 crc kubenswrapper[4816]: I0316 00:06:53.553498 4816 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 16 00:06:53 crc kubenswrapper[4816]: I0316 00:06:53.756884 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"603bdcbd4a6af5fda7ca09e4823e0db8d7ec3e3eb38eddf9f062a14b433cb094"} Mar 16 00:06:53 crc kubenswrapper[4816]: I0316 00:06:53.756972 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:06:53 crc kubenswrapper[4816]: I0316 00:06:53.757078 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:06:53 crc kubenswrapper[4816]: I0316 00:06:53.757207 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:06:53 crc 
kubenswrapper[4816]: I0316 00:06:53.759098 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:06:53 crc kubenswrapper[4816]: I0316 00:06:53.759168 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:06:53 crc kubenswrapper[4816]: I0316 00:06:53.759190 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:06:53 crc kubenswrapper[4816]: I0316 00:06:53.759650 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:06:53 crc kubenswrapper[4816]: I0316 00:06:53.759703 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:06:53 crc kubenswrapper[4816]: I0316 00:06:53.759736 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:06:53 crc kubenswrapper[4816]: I0316 00:06:53.759768 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:06:53 crc kubenswrapper[4816]: I0316 00:06:53.759744 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:06:53 crc kubenswrapper[4816]: I0316 00:06:53.759815 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:06:54 crc kubenswrapper[4816]: I0316 00:06:54.073249 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:06:54 crc kubenswrapper[4816]: I0316 00:06:54.080333 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:06:54 crc kubenswrapper[4816]: I0316 00:06:54.080383 4816 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasNoDiskPressure" Mar 16 00:06:54 crc kubenswrapper[4816]: I0316 00:06:54.080401 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:06:54 crc kubenswrapper[4816]: I0316 00:06:54.080443 4816 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 16 00:06:54 crc kubenswrapper[4816]: I0316 00:06:54.660197 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 16 00:06:54 crc kubenswrapper[4816]: I0316 00:06:54.760968 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:06:54 crc kubenswrapper[4816]: I0316 00:06:54.761183 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:06:54 crc kubenswrapper[4816]: I0316 00:06:54.762364 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:06:54 crc kubenswrapper[4816]: I0316 00:06:54.762446 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:06:54 crc kubenswrapper[4816]: I0316 00:06:54.762480 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:06:54 crc kubenswrapper[4816]: I0316 00:06:54.762829 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:06:54 crc kubenswrapper[4816]: I0316 00:06:54.762890 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:06:54 crc kubenswrapper[4816]: I0316 00:06:54.762912 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:06:55 crc kubenswrapper[4816]: I0316 00:06:55.179625 4816 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 16 00:06:55 crc kubenswrapper[4816]: I0316 00:06:55.179978 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:06:55 crc kubenswrapper[4816]: I0316 00:06:55.181664 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:06:55 crc kubenswrapper[4816]: I0316 00:06:55.181722 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:06:55 crc kubenswrapper[4816]: I0316 00:06:55.181739 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:06:56 crc kubenswrapper[4816]: I0316 00:06:56.292820 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Mar 16 00:06:56 crc kubenswrapper[4816]: I0316 00:06:56.292999 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:06:56 crc kubenswrapper[4816]: I0316 00:06:56.294241 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:06:56 crc kubenswrapper[4816]: I0316 00:06:56.294397 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:06:56 crc kubenswrapper[4816]: I0316 00:06:56.294428 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:06:56 crc kubenswrapper[4816]: I0316 00:06:56.901887 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Mar 16 00:06:56 crc kubenswrapper[4816]: I0316 00:06:56.902257 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:06:56 crc 
kubenswrapper[4816]: I0316 00:06:56.904148 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:06:56 crc kubenswrapper[4816]: I0316 00:06:56.904195 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:06:56 crc kubenswrapper[4816]: I0316 00:06:56.904215 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:06:57 crc kubenswrapper[4816]: E0316 00:06:57.769261 4816 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 16 00:06:58 crc kubenswrapper[4816]: I0316 00:06:58.734870 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 16 00:06:58 crc kubenswrapper[4816]: I0316 00:06:58.735086 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:06:58 crc kubenswrapper[4816]: I0316 00:06:58.736859 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:06:58 crc kubenswrapper[4816]: I0316 00:06:58.736956 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:06:58 crc kubenswrapper[4816]: I0316 00:06:58.736977 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:06:58 crc kubenswrapper[4816]: I0316 00:06:58.871903 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 16 00:06:58 crc kubenswrapper[4816]: I0316 00:06:58.872229 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:06:58 crc kubenswrapper[4816]: I0316 
00:06:58.874329 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:06:58 crc kubenswrapper[4816]: I0316 00:06:58.874388 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:06:58 crc kubenswrapper[4816]: I0316 00:06:58.874406 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:01 crc kubenswrapper[4816]: I0316 00:07:01.292336 4816 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Mar 16 00:07:01 crc kubenswrapper[4816]: I0316 00:07:01.293008 4816 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Mar 16 00:07:01 crc kubenswrapper[4816]: W0316 00:07:01.561166 4816 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout Mar 16 00:07:01 crc kubenswrapper[4816]: I0316 00:07:01.561326 4816 trace.go:236] Trace[2030524574]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (16-Mar-2026 00:06:51.559) (total time: 10001ms): Mar 16 00:07:01 crc kubenswrapper[4816]: Trace[2030524574]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (00:07:01.561) Mar 16 00:07:01 
crc kubenswrapper[4816]: Trace[2030524574]: [10.001807799s] [10.001807799s] END Mar 16 00:07:01 crc kubenswrapper[4816]: E0316 00:07:01.561371 4816 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Mar 16 00:07:01 crc kubenswrapper[4816]: I0316 00:07:01.579484 4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Mar 16 00:07:01 crc kubenswrapper[4816]: W0316 00:07:01.625094 4816 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout Mar 16 00:07:01 crc kubenswrapper[4816]: I0316 00:07:01.625444 4816 trace.go:236] Trace[1896188209]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (16-Mar-2026 00:06:51.623) (total time: 10002ms): Mar 16 00:07:01 crc kubenswrapper[4816]: Trace[1896188209]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (00:07:01.625) Mar 16 00:07:01 crc kubenswrapper[4816]: Trace[1896188209]: [10.002036675s] [10.002036675s] END Mar 16 00:07:01 crc kubenswrapper[4816]: E0316 00:07:01.625619 4816 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Mar 16 
00:07:01 crc kubenswrapper[4816]: I0316 00:07:01.871822 4816 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 16 00:07:01 crc kubenswrapper[4816]: I0316 00:07:01.871884 4816 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 16 00:07:02 crc kubenswrapper[4816]: W0316 00:07:02.305825 4816 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:02Z is after 2026-02-23T05:33:13Z Mar 16 00:07:02 crc kubenswrapper[4816]: E0316 00:07:02.306756 4816 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:02Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 16 00:07:02 crc kubenswrapper[4816]: W0316 00:07:02.309494 4816 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get 
"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:02Z is after 2026-02-23T05:33:13Z Mar 16 00:07:02 crc kubenswrapper[4816]: I0316 00:07:02.309680 4816 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 16 00:07:02 crc kubenswrapper[4816]: I0316 00:07:02.309784 4816 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Mar 16 00:07:02 crc kubenswrapper[4816]: E0316 00:07:02.309921 4816 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:02Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 16 00:07:02 crc kubenswrapper[4816]: E0316 00:07:02.313040 4816 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:02Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189d29a2d41dfe7d default 0 0001-01-01 00:00:00 
+0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:47.574019709 +0000 UTC m=+0.670319702,LastTimestamp:2026-03-16 00:06:47.574019709 +0000 UTC m=+0.670319702,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:02 crc kubenswrapper[4816]: I0316 00:07:02.317429 4816 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 16 00:07:02 crc kubenswrapper[4816]: I0316 00:07:02.317669 4816 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Mar 16 00:07:02 crc kubenswrapper[4816]: E0316 00:07:02.318395 4816 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:02Z is after 2026-02-23T05:33:13Z" interval="6.4s" Mar 16 00:07:02 crc kubenswrapper[4816]: E0316 00:07:02.320108 4816 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post 
\"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:02Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 16 00:07:02 crc kubenswrapper[4816]: E0316 00:07:02.322075 4816 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:02Z is after 2026-02-23T05:33:13Z" node="crc" Mar 16 00:07:02 crc kubenswrapper[4816]: I0316 00:07:02.581348 4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:02Z is after 2026-02-23T05:33:13Z Mar 16 00:07:02 crc kubenswrapper[4816]: I0316 00:07:02.784914 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 16 00:07:02 crc kubenswrapper[4816]: I0316 00:07:02.788156 4816 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="9285f97ca65307db81fc7bd5712ac11ee1561d8c18d684087969ba4d57ea8651" exitCode=255 Mar 16 00:07:02 crc kubenswrapper[4816]: I0316 00:07:02.788230 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"9285f97ca65307db81fc7bd5712ac11ee1561d8c18d684087969ba4d57ea8651"} Mar 16 00:07:02 crc kubenswrapper[4816]: I0316 00:07:02.788471 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" 
Mar 16 00:07:02 crc kubenswrapper[4816]: I0316 00:07:02.789675 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:02 crc kubenswrapper[4816]: I0316 00:07:02.789740 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:02 crc kubenswrapper[4816]: I0316 00:07:02.789758 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:02 crc kubenswrapper[4816]: I0316 00:07:02.790689 4816 scope.go:117] "RemoveContainer" containerID="9285f97ca65307db81fc7bd5712ac11ee1561d8c18d684087969ba4d57ea8651" Mar 16 00:07:03 crc kubenswrapper[4816]: I0316 00:07:03.582296 4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:03Z is after 2026-02-23T05:33:13Z Mar 16 00:07:03 crc kubenswrapper[4816]: I0316 00:07:03.794113 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 16 00:07:03 crc kubenswrapper[4816]: I0316 00:07:03.795087 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 16 00:07:03 crc kubenswrapper[4816]: I0316 00:07:03.798328 4816 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="6d233ce0529e5c4ef8adad4fcd1615994765511ce6dba51708a5f933ea9c3a3e" exitCode=255 Mar 16 00:07:03 crc kubenswrapper[4816]: I0316 00:07:03.798386 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"6d233ce0529e5c4ef8adad4fcd1615994765511ce6dba51708a5f933ea9c3a3e"} Mar 16 00:07:03 crc kubenswrapper[4816]: I0316 00:07:03.798501 4816 scope.go:117] "RemoveContainer" containerID="9285f97ca65307db81fc7bd5712ac11ee1561d8c18d684087969ba4d57ea8651" Mar 16 00:07:03 crc kubenswrapper[4816]: I0316 00:07:03.798734 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:07:03 crc kubenswrapper[4816]: I0316 00:07:03.800243 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:03 crc kubenswrapper[4816]: I0316 00:07:03.800280 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:03 crc kubenswrapper[4816]: I0316 00:07:03.800294 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:03 crc kubenswrapper[4816]: I0316 00:07:03.800988 4816 scope.go:117] "RemoveContainer" containerID="6d233ce0529e5c4ef8adad4fcd1615994765511ce6dba51708a5f933ea9c3a3e" Mar 16 00:07:03 crc kubenswrapper[4816]: E0316 00:07:03.801382 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 16 00:07:04 crc kubenswrapper[4816]: I0316 00:07:04.583094 4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-16T00:07:04Z is after 2026-02-23T05:33:13Z Mar 16 00:07:04 crc kubenswrapper[4816]: I0316 00:07:04.804185 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 16 00:07:05 crc kubenswrapper[4816]: I0316 00:07:05.188795 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 16 00:07:05 crc kubenswrapper[4816]: I0316 00:07:05.189003 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:07:05 crc kubenswrapper[4816]: I0316 00:07:05.190728 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:05 crc kubenswrapper[4816]: I0316 00:07:05.190810 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:05 crc kubenswrapper[4816]: I0316 00:07:05.190838 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:05 crc kubenswrapper[4816]: I0316 00:07:05.191960 4816 scope.go:117] "RemoveContainer" containerID="6d233ce0529e5c4ef8adad4fcd1615994765511ce6dba51708a5f933ea9c3a3e" Mar 16 00:07:05 crc kubenswrapper[4816]: E0316 00:07:05.192369 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 16 00:07:05 crc kubenswrapper[4816]: I0316 00:07:05.196912 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 16 
00:07:05 crc kubenswrapper[4816]: I0316 00:07:05.584366 4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:05Z is after 2026-02-23T05:33:13Z Mar 16 00:07:05 crc kubenswrapper[4816]: I0316 00:07:05.810253 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:07:05 crc kubenswrapper[4816]: I0316 00:07:05.811681 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:05 crc kubenswrapper[4816]: I0316 00:07:05.811769 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:05 crc kubenswrapper[4816]: I0316 00:07:05.811797 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:05 crc kubenswrapper[4816]: I0316 00:07:05.813220 4816 scope.go:117] "RemoveContainer" containerID="6d233ce0529e5c4ef8adad4fcd1615994765511ce6dba51708a5f933ea9c3a3e" Mar 16 00:07:05 crc kubenswrapper[4816]: E0316 00:07:05.813691 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 16 00:07:06 crc kubenswrapper[4816]: W0316 00:07:06.256154 4816 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:06Z is after 2026-02-23T05:33:13Z Mar 16 00:07:06 crc kubenswrapper[4816]: E0316 00:07:06.256271 4816 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:06Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 16 00:07:06 crc kubenswrapper[4816]: I0316 00:07:06.335791 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Mar 16 00:07:06 crc kubenswrapper[4816]: I0316 00:07:06.336079 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:07:06 crc kubenswrapper[4816]: I0316 00:07:06.337867 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:06 crc kubenswrapper[4816]: I0316 00:07:06.337958 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:06 crc kubenswrapper[4816]: I0316 00:07:06.337982 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:06 crc kubenswrapper[4816]: I0316 00:07:06.358366 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Mar 16 00:07:06 crc kubenswrapper[4816]: I0316 00:07:06.583409 4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-16T00:07:06Z is after 2026-02-23T05:33:13Z Mar 16 00:07:06 crc kubenswrapper[4816]: W0316 00:07:06.789685 4816 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:06Z is after 2026-02-23T05:33:13Z Mar 16 00:07:06 crc kubenswrapper[4816]: E0316 00:07:06.789820 4816 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:06Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 16 00:07:06 crc kubenswrapper[4816]: I0316 00:07:06.807948 4816 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 16 00:07:06 crc kubenswrapper[4816]: I0316 00:07:06.812923 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:07:06 crc kubenswrapper[4816]: I0316 00:07:06.812971 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:07:06 crc kubenswrapper[4816]: I0316 00:07:06.814898 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:06 crc kubenswrapper[4816]: I0316 00:07:06.814923 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:06 crc kubenswrapper[4816]: I0316 00:07:06.814960 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 
00:07:06 crc kubenswrapper[4816]: I0316 00:07:06.814980 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:06 crc kubenswrapper[4816]: I0316 00:07:06.814980 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:06 crc kubenswrapper[4816]: I0316 00:07:06.815070 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:06 crc kubenswrapper[4816]: I0316 00:07:06.818620 4816 scope.go:117] "RemoveContainer" containerID="6d233ce0529e5c4ef8adad4fcd1615994765511ce6dba51708a5f933ea9c3a3e" Mar 16 00:07:06 crc kubenswrapper[4816]: E0316 00:07:06.819759 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 16 00:07:07 crc kubenswrapper[4816]: I0316 00:07:07.583540 4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:07Z is after 2026-02-23T05:33:13Z Mar 16 00:07:07 crc kubenswrapper[4816]: E0316 00:07:07.769497 4816 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 16 00:07:08 crc kubenswrapper[4816]: I0316 00:07:08.581361 4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: 
failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:08Z is after 2026-02-23T05:33:13Z Mar 16 00:07:08 crc kubenswrapper[4816]: I0316 00:07:08.722358 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:07:08 crc kubenswrapper[4816]: E0316 00:07:08.723364 4816 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:08Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 16 00:07:08 crc kubenswrapper[4816]: I0316 00:07:08.724014 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:08 crc kubenswrapper[4816]: I0316 00:07:08.724092 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:08 crc kubenswrapper[4816]: I0316 00:07:08.724118 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:08 crc kubenswrapper[4816]: I0316 00:07:08.724173 4816 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 16 00:07:08 crc kubenswrapper[4816]: E0316 00:07:08.731131 4816 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:08Z is after 2026-02-23T05:33:13Z" node="crc" Mar 16 00:07:09 crc kubenswrapper[4816]: I0316 00:07:09.582689 4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: 
failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:09Z is after 2026-02-23T05:33:13Z Mar 16 00:07:10 crc kubenswrapper[4816]: I0316 00:07:10.337713 4816 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 16 00:07:10 crc kubenswrapper[4816]: E0316 00:07:10.342044 4816 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:10Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 16 00:07:10 crc kubenswrapper[4816]: I0316 00:07:10.582606 4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:10Z is after 2026-02-23T05:33:13Z Mar 16 00:07:11 crc kubenswrapper[4816]: I0316 00:07:11.291786 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 16 00:07:11 crc kubenswrapper[4816]: I0316 00:07:11.292032 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:07:11 crc kubenswrapper[4816]: I0316 00:07:11.293447 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:11 crc kubenswrapper[4816]: I0316 00:07:11.293516 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:11 crc kubenswrapper[4816]: I0316 
00:07:11.293534 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:11 crc kubenswrapper[4816]: I0316 00:07:11.294460 4816 scope.go:117] "RemoveContainer" containerID="6d233ce0529e5c4ef8adad4fcd1615994765511ce6dba51708a5f933ea9c3a3e" Mar 16 00:07:11 crc kubenswrapper[4816]: E0316 00:07:11.294777 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 16 00:07:11 crc kubenswrapper[4816]: I0316 00:07:11.581380 4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:11Z is after 2026-02-23T05:33:13Z Mar 16 00:07:11 crc kubenswrapper[4816]: W0316 00:07:11.798042 4816 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:11Z is after 2026-02-23T05:33:13Z Mar 16 00:07:11 crc kubenswrapper[4816]: E0316 00:07:11.798160 4816 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or 
is not yet valid: current time 2026-03-16T00:07:11Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 16 00:07:11 crc kubenswrapper[4816]: I0316 00:07:11.872192 4816 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 16 00:07:11 crc kubenswrapper[4816]: I0316 00:07:11.872311 4816 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 16 00:07:12 crc kubenswrapper[4816]: E0316 00:07:12.318645 4816 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:12Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189d29a2d41dfe7d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:47.574019709 +0000 UTC m=+0.670319702,LastTimestamp:2026-03-16 00:06:47.574019709 +0000 UTC m=+0.670319702,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:12 crc kubenswrapper[4816]: I0316 00:07:12.581834 
4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:12Z is after 2026-02-23T05:33:13Z Mar 16 00:07:12 crc kubenswrapper[4816]: W0316 00:07:12.760569 4816 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:12Z is after 2026-02-23T05:33:13Z Mar 16 00:07:12 crc kubenswrapper[4816]: E0316 00:07:12.760649 4816 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:12Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 16 00:07:13 crc kubenswrapper[4816]: I0316 00:07:13.584060 4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:13Z is after 2026-02-23T05:33:13Z Mar 16 00:07:13 crc kubenswrapper[4816]: W0316 00:07:13.895017 4816 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2026-03-16T00:07:13Z is after 2026-02-23T05:33:13Z Mar 16 00:07:13 crc kubenswrapper[4816]: E0316 00:07:13.895127 4816 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:13Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 16 00:07:14 crc kubenswrapper[4816]: I0316 00:07:14.583636 4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:14Z is after 2026-02-23T05:33:13Z Mar 16 00:07:15 crc kubenswrapper[4816]: I0316 00:07:15.583581 4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:15Z is after 2026-02-23T05:33:13Z Mar 16 00:07:15 crc kubenswrapper[4816]: E0316 00:07:15.729490 4816 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:15Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 16 00:07:15 crc kubenswrapper[4816]: I0316 00:07:15.731631 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:07:15 crc kubenswrapper[4816]: I0316 
00:07:15.733321 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:15 crc kubenswrapper[4816]: I0316 00:07:15.733376 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:15 crc kubenswrapper[4816]: I0316 00:07:15.733398 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:15 crc kubenswrapper[4816]: I0316 00:07:15.733435 4816 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 16 00:07:15 crc kubenswrapper[4816]: E0316 00:07:15.738404 4816 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:15Z is after 2026-02-23T05:33:13Z" node="crc" Mar 16 00:07:16 crc kubenswrapper[4816]: I0316 00:07:16.584027 4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:16Z is after 2026-02-23T05:33:13Z Mar 16 00:07:17 crc kubenswrapper[4816]: I0316 00:07:17.582077 4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:17Z is after 2026-02-23T05:33:13Z Mar 16 00:07:17 crc kubenswrapper[4816]: E0316 00:07:17.769669 4816 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 16 00:07:17 crc kubenswrapper[4816]: 
W0316 00:07:17.911243 4816 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:17Z is after 2026-02-23T05:33:13Z Mar 16 00:07:17 crc kubenswrapper[4816]: E0316 00:07:17.911361 4816 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:17Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 16 00:07:18 crc kubenswrapper[4816]: I0316 00:07:18.581526 4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:18Z is after 2026-02-23T05:33:13Z Mar 16 00:07:19 crc kubenswrapper[4816]: I0316 00:07:19.582303 4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:19Z is after 2026-02-23T05:33:13Z Mar 16 00:07:20 crc kubenswrapper[4816]: I0316 00:07:20.141280 4816 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": read tcp 
192.168.126.11:44098->192.168.126.11:10357: read: connection reset by peer" start-of-body= Mar 16 00:07:20 crc kubenswrapper[4816]: I0316 00:07:20.141378 4816 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": read tcp 192.168.126.11:44098->192.168.126.11:10357: read: connection reset by peer" Mar 16 00:07:20 crc kubenswrapper[4816]: I0316 00:07:20.141467 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 16 00:07:20 crc kubenswrapper[4816]: I0316 00:07:20.141818 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:07:20 crc kubenswrapper[4816]: I0316 00:07:20.143822 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:20 crc kubenswrapper[4816]: I0316 00:07:20.143913 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:20 crc kubenswrapper[4816]: I0316 00:07:20.143936 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:20 crc kubenswrapper[4816]: I0316 00:07:20.145250 4816 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"fea427361067bc1a9d56b7b6699b072b5cdeee8345bf6618b8d81c6848f62098"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Mar 16 00:07:20 crc kubenswrapper[4816]: I0316 00:07:20.145627 4816 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://fea427361067bc1a9d56b7b6699b072b5cdeee8345bf6618b8d81c6848f62098" gracePeriod=30 Mar 16 00:07:20 crc kubenswrapper[4816]: I0316 00:07:20.583670 4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:20Z is after 2026-02-23T05:33:13Z Mar 16 00:07:20 crc kubenswrapper[4816]: I0316 00:07:20.854889 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 16 00:07:20 crc kubenswrapper[4816]: I0316 00:07:20.855331 4816 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="fea427361067bc1a9d56b7b6699b072b5cdeee8345bf6618b8d81c6848f62098" exitCode=255 Mar 16 00:07:20 crc kubenswrapper[4816]: I0316 00:07:20.855413 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"fea427361067bc1a9d56b7b6699b072b5cdeee8345bf6618b8d81c6848f62098"} Mar 16 00:07:20 crc kubenswrapper[4816]: I0316 00:07:20.855461 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"b6e7e97dbf041ed59e094f9be34956cb28d4774022777f2371a2eb752937f551"} Mar 16 00:07:20 crc kubenswrapper[4816]: I0316 00:07:20.855653 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 
00:07:20 crc kubenswrapper[4816]: I0316 00:07:20.857146 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:20 crc kubenswrapper[4816]: I0316 00:07:20.857275 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:20 crc kubenswrapper[4816]: I0316 00:07:20.857302 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:21 crc kubenswrapper[4816]: I0316 00:07:21.582523 4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:21Z is after 2026-02-23T05:33:13Z Mar 16 00:07:22 crc kubenswrapper[4816]: E0316 00:07:22.324481 4816 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:22Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189d29a2d41dfe7d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:47.574019709 +0000 UTC m=+0.670319702,LastTimestamp:2026-03-16 00:06:47.574019709 +0000 UTC m=+0.670319702,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:22 crc kubenswrapper[4816]: I0316 00:07:22.583683 4816 csi_plugin.go:884] Failed to contact API server when waiting for 
CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:22Z is after 2026-02-23T05:33:13Z Mar 16 00:07:22 crc kubenswrapper[4816]: E0316 00:07:22.735415 4816 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:22Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 16 00:07:22 crc kubenswrapper[4816]: I0316 00:07:22.739585 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:07:22 crc kubenswrapper[4816]: I0316 00:07:22.741494 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:22 crc kubenswrapper[4816]: I0316 00:07:22.741602 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:22 crc kubenswrapper[4816]: I0316 00:07:22.741622 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:22 crc kubenswrapper[4816]: I0316 00:07:22.741675 4816 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 16 00:07:22 crc kubenswrapper[4816]: E0316 00:07:22.746791 4816 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:22Z is after 2026-02-23T05:33:13Z" node="crc" Mar 16 00:07:23 crc kubenswrapper[4816]: I0316 00:07:23.583580 4816 csi_plugin.go:884] Failed to contact API server when waiting for 
CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:23Z is after 2026-02-23T05:33:13Z Mar 16 00:07:24 crc kubenswrapper[4816]: I0316 00:07:24.581979 4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:24Z is after 2026-02-23T05:33:13Z Mar 16 00:07:24 crc kubenswrapper[4816]: I0316 00:07:24.660697 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 16 00:07:24 crc kubenswrapper[4816]: I0316 00:07:24.660992 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:07:24 crc kubenswrapper[4816]: I0316 00:07:24.662805 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:24 crc kubenswrapper[4816]: I0316 00:07:24.662881 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:24 crc kubenswrapper[4816]: I0316 00:07:24.662908 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:24 crc kubenswrapper[4816]: I0316 00:07:24.667363 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:07:24 crc kubenswrapper[4816]: I0316 00:07:24.673272 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:24 crc kubenswrapper[4816]: I0316 00:07:24.673350 4816 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:24 crc kubenswrapper[4816]: I0316 00:07:24.673377 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:24 crc kubenswrapper[4816]: I0316 00:07:24.674433 4816 scope.go:117] "RemoveContainer" containerID="6d233ce0529e5c4ef8adad4fcd1615994765511ce6dba51708a5f933ea9c3a3e" Mar 16 00:07:25 crc kubenswrapper[4816]: I0316 00:07:25.585824 4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:25Z is after 2026-02-23T05:33:13Z Mar 16 00:07:25 crc kubenswrapper[4816]: I0316 00:07:25.874123 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 16 00:07:25 crc kubenswrapper[4816]: I0316 00:07:25.875254 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 16 00:07:25 crc kubenswrapper[4816]: I0316 00:07:25.877770 4816 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="97295c99d30410e470f248a46f06606331693794a78b950d14f87ec94dc3c6d8" exitCode=255 Mar 16 00:07:25 crc kubenswrapper[4816]: I0316 00:07:25.877825 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"97295c99d30410e470f248a46f06606331693794a78b950d14f87ec94dc3c6d8"} Mar 16 00:07:25 crc kubenswrapper[4816]: I0316 00:07:25.877884 4816 scope.go:117] "RemoveContainer" 
containerID="6d233ce0529e5c4ef8adad4fcd1615994765511ce6dba51708a5f933ea9c3a3e" Mar 16 00:07:25 crc kubenswrapper[4816]: I0316 00:07:25.878090 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:07:25 crc kubenswrapper[4816]: I0316 00:07:25.879240 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:25 crc kubenswrapper[4816]: I0316 00:07:25.879280 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:25 crc kubenswrapper[4816]: I0316 00:07:25.879319 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:25 crc kubenswrapper[4816]: I0316 00:07:25.879958 4816 scope.go:117] "RemoveContainer" containerID="97295c99d30410e470f248a46f06606331693794a78b950d14f87ec94dc3c6d8" Mar 16 00:07:25 crc kubenswrapper[4816]: E0316 00:07:25.880221 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 16 00:07:26 crc kubenswrapper[4816]: I0316 00:07:26.580926 4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:26Z is after 2026-02-23T05:33:13Z Mar 16 00:07:26 crc kubenswrapper[4816]: I0316 00:07:26.808581 4816 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 16 
00:07:26 crc kubenswrapper[4816]: I0316 00:07:26.881774 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 16 00:07:26 crc kubenswrapper[4816]: I0316 00:07:26.884169 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:07:26 crc kubenswrapper[4816]: I0316 00:07:26.885281 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:26 crc kubenswrapper[4816]: I0316 00:07:26.885320 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:26 crc kubenswrapper[4816]: I0316 00:07:26.885333 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:26 crc kubenswrapper[4816]: I0316 00:07:26.885957 4816 scope.go:117] "RemoveContainer" containerID="97295c99d30410e470f248a46f06606331693794a78b950d14f87ec94dc3c6d8" Mar 16 00:07:26 crc kubenswrapper[4816]: E0316 00:07:26.886188 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 16 00:07:27 crc kubenswrapper[4816]: I0316 00:07:27.127015 4816 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 16 00:07:27 crc kubenswrapper[4816]: I0316 00:07:27.149664 4816 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 16 00:07:27 crc kubenswrapper[4816]: I0316 00:07:27.594587 4816 
csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 16 00:07:27 crc kubenswrapper[4816]: E0316 00:07:27.769963 4816 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 16 00:07:28 crc kubenswrapper[4816]: I0316 00:07:28.587455 4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 16 00:07:28 crc kubenswrapper[4816]: I0316 00:07:28.872040 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 16 00:07:28 crc kubenswrapper[4816]: I0316 00:07:28.872795 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:07:28 crc kubenswrapper[4816]: I0316 00:07:28.874528 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:28 crc kubenswrapper[4816]: I0316 00:07:28.874632 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:28 crc kubenswrapper[4816]: I0316 00:07:28.874651 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:29 crc kubenswrapper[4816]: I0316 00:07:29.586005 4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 16 00:07:29 crc kubenswrapper[4816]: E0316 00:07:29.743857 4816 
controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 16 00:07:29 crc kubenswrapper[4816]: I0316 00:07:29.747975 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:07:29 crc kubenswrapper[4816]: I0316 00:07:29.750025 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:29 crc kubenswrapper[4816]: I0316 00:07:29.750110 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:29 crc kubenswrapper[4816]: I0316 00:07:29.750137 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:29 crc kubenswrapper[4816]: I0316 00:07:29.750188 4816 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 16 00:07:29 crc kubenswrapper[4816]: E0316 00:07:29.757524 4816 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 16 00:07:30 crc kubenswrapper[4816]: I0316 00:07:30.588095 4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 16 00:07:31 crc kubenswrapper[4816]: W0316 00:07:31.162499 4816 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope Mar 16 00:07:31 crc kubenswrapper[4816]: E0316 00:07:31.162681 4816 
reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 16 00:07:31 crc kubenswrapper[4816]: I0316 00:07:31.291862 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 16 00:07:31 crc kubenswrapper[4816]: I0316 00:07:31.292157 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:07:31 crc kubenswrapper[4816]: I0316 00:07:31.294950 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:31 crc kubenswrapper[4816]: I0316 00:07:31.295030 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:31 crc kubenswrapper[4816]: I0316 00:07:31.295047 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:31 crc kubenswrapper[4816]: I0316 00:07:31.295963 4816 scope.go:117] "RemoveContainer" containerID="97295c99d30410e470f248a46f06606331693794a78b950d14f87ec94dc3c6d8" Mar 16 00:07:31 crc kubenswrapper[4816]: E0316 00:07:31.296203 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 16 00:07:31 crc kubenswrapper[4816]: I0316 00:07:31.586876 4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User 
"system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 16 00:07:31 crc kubenswrapper[4816]: I0316 00:07:31.873592 4816 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 16 00:07:31 crc kubenswrapper[4816]: I0316 00:07:31.873688 4816 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.332482 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d29a2d41dfe7d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:47.574019709 +0000 UTC m=+0.670319702,LastTimestamp:2026-03-16 00:06:47.574019709 +0000 UTC m=+0.670319702,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.341299 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" 
cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d29a2d8eed912 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:47.654816018 +0000 UTC m=+0.751116011,LastTimestamp:2026-03-16 00:06:47.654816018 +0000 UTC m=+0.751116011,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.348893 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d29a2d8ef4dca default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:47.654845898 +0000 UTC m=+0.751145891,LastTimestamp:2026-03-16 00:06:47.654845898 +0000 UTC m=+0.751145891,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.356474 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d29a2d8ef9bb9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:47.654865849 +0000 UTC m=+0.751165852,LastTimestamp:2026-03-16 00:06:47.654865849 +0000 UTC m=+0.751165852,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.363401 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d29a2df2b0113 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across pods,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:47.759421715 +0000 UTC m=+0.855721668,LastTimestamp:2026-03-16 00:06:47.759421715 +0000 UTC m=+0.855721668,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.371133 4816 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189d29a2d8eed912\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d29a2d8eed912 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: 
NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:47.654816018 +0000 UTC m=+0.751116011,LastTimestamp:2026-03-16 00:06:47.768479757 +0000 UTC m=+0.864779710,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.378926 4816 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189d29a2d8ef4dca\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d29a2d8ef4dca default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:47.654845898 +0000 UTC m=+0.751145891,LastTimestamp:2026-03-16 00:06:47.768507708 +0000 UTC m=+0.864807661,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.386027 4816 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189d29a2d8ef9bb9\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d29a2d8ef9bb9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:47.654865849 +0000 UTC m=+0.751165852,LastTimestamp:2026-03-16 
00:06:47.768519078 +0000 UTC m=+0.864819031,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.393258 4816 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189d29a2d8eed912\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d29a2d8eed912 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:47.654816018 +0000 UTC m=+0.751116011,LastTimestamp:2026-03-16 00:06:47.770283342 +0000 UTC m=+0.866583335,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.400859 4816 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189d29a2d8ef4dca\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d29a2d8ef4dca default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:47.654845898 +0000 UTC m=+0.751145891,LastTimestamp:2026-03-16 00:06:47.770308312 +0000 UTC m=+0.866608305,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.408538 4816 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189d29a2d8ef9bb9\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d29a2d8ef9bb9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:47.654865849 +0000 UTC m=+0.751165852,LastTimestamp:2026-03-16 00:06:47.770326442 +0000 UTC m=+0.866626425,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.415488 4816 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189d29a2d8eed912\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d29a2d8eed912 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:47.654816018 +0000 UTC m=+0.751116011,LastTimestamp:2026-03-16 00:06:47.771480614 +0000 UTC m=+0.867780557,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.422588 4816 
event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189d29a2d8ef4dca\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d29a2d8ef4dca default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:47.654845898 +0000 UTC m=+0.751145891,LastTimestamp:2026-03-16 00:06:47.771504345 +0000 UTC m=+0.867804298,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.429526 4816 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189d29a2d8ef9bb9\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d29a2d8ef9bb9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:47.654865849 +0000 UTC m=+0.751165852,LastTimestamp:2026-03-16 00:06:47.771524235 +0000 UTC m=+0.867824188,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.436729 4816 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189d29a2d8eed912\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in 
API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d29a2d8eed912 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:47.654816018 +0000 UTC m=+0.751116011,LastTimestamp:2026-03-16 00:06:47.771930293 +0000 UTC m=+0.868230286,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.444848 4816 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189d29a2d8ef4dca\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d29a2d8ef4dca default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:47.654845898 +0000 UTC m=+0.751145891,LastTimestamp:2026-03-16 00:06:47.771959423 +0000 UTC m=+0.868259416,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.452041 4816 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189d29a2d8ef9bb9\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d29a2d8ef9bb9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:47.654865849 +0000 UTC m=+0.751165852,LastTimestamp:2026-03-16 00:06:47.771982094 +0000 UTC m=+0.868282087,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.459414 4816 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189d29a2d8eed912\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d29a2d8eed912 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:47.654816018 +0000 UTC m=+0.751116011,LastTimestamp:2026-03-16 00:06:47.772875671 +0000 UTC m=+0.869175624,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.468484 4816 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189d29a2d8ef4dca\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d29a2d8ef4dca default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc 
status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:47.654845898 +0000 UTC m=+0.751145891,LastTimestamp:2026-03-16 00:06:47.772904371 +0000 UTC m=+0.869204324,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.474686 4816 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189d29a2d8ef9bb9\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d29a2d8ef9bb9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:47.654865849 +0000 UTC m=+0.751165852,LastTimestamp:2026-03-16 00:06:47.772914312 +0000 UTC m=+0.869214265,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.482095 4816 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189d29a2d8eed912\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d29a2d8eed912 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:47.654816018 +0000 UTC 
m=+0.751116011,LastTimestamp:2026-03-16 00:06:47.773514023 +0000 UTC m=+0.869814016,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.487989 4816 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189d29a2d8ef4dca\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d29a2d8ef4dca default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:47.654845898 +0000 UTC m=+0.751145891,LastTimestamp:2026-03-16 00:06:47.773587264 +0000 UTC m=+0.869887257,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.495228 4816 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189d29a2d8ef9bb9\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d29a2d8ef9bb9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:47.654865849 +0000 UTC m=+0.751165852,LastTimestamp:2026-03-16 00:06:47.773607615 +0000 UTC m=+0.869907598,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.502510 4816 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189d29a2d8eed912\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d29a2d8eed912 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:47.654816018 +0000 UTC m=+0.751116011,LastTimestamp:2026-03-16 00:06:47.774255527 +0000 UTC m=+0.870555510,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.511438 4816 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189d29a2d8ef4dca\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d29a2d8ef4dca default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:47.654845898 +0000 UTC m=+0.751145891,LastTimestamp:2026-03-16 00:06:47.774290908 +0000 UTC m=+0.870590891,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.522865 4816 
event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189d29a2f7affde1 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:48.170790369 +0000 UTC m=+1.267090332,LastTimestamp:2026-03-16 00:06:48.170790369 +0000 UTC m=+1.267090332,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.530165 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189d29a2f7b1a067 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on 
machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:48.170897511 +0000 UTC m=+1.267197504,LastTimestamp:2026-03-16 00:06:48.170897511 +0000 UTC m=+1.267197504,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.537862 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189d29a2f7b3baf8 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:48.171035384 +0000 UTC m=+1.267335347,LastTimestamp:2026-03-16 00:06:48.171035384 +0000 UTC m=+1.267335347,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.546009 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d29a2f8a0499e openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:48.186538398 +0000 UTC m=+1.282838391,LastTimestamp:2026-03-16 00:06:48.186538398 +0000 UTC m=+1.282838391,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.554916 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189d29a2f97f1f50 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:48.201142096 +0000 UTC m=+1.297442089,LastTimestamp:2026-03-16 00:06:48.201142096 +0000 UTC m=+1.297442089,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.563105 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: 
User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189d29a31b79fcba openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Created,Message:Created container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:48.771230906 +0000 UTC m=+1.867530899,LastTimestamp:2026-03-16 00:06:48.771230906 +0000 UTC m=+1.867530899,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.567532 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189d29a31c03f106 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Created,Message:Created container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:48.780271878 +0000 UTC m=+1.876571841,LastTimestamp:2026-03-16 00:06:48.780271878 +0000 UTC m=+1.876571841,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc 
kubenswrapper[4816]: E0316 00:07:32.569375 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189d29a31c1d710a openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:48.78194305 +0000 UTC m=+1.878243003,LastTimestamp:2026-03-16 00:06:48.78194305 +0000 UTC m=+1.878243003,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.574256 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d29a31c6ef869 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:48.787286121 +0000 UTC m=+1.883586074,LastTimestamp:2026-03-16 00:06:48.787286121 +0000 UTC m=+1.883586074,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.576055 4816 
event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189d29a31c7d3b6b openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Started,Message:Started container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:48.788220779 +0000 UTC m=+1.884520752,LastTimestamp:2026-03-16 00:06:48.788220779 +0000 UTC m=+1.884520752,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.581653 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189d29a31c83da23 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:48.788654627 +0000 UTC m=+1.884954620,LastTimestamp:2026-03-16 00:06:48.788654627 +0000 UTC m=+1.884954620,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: I0316 00:07:32.581762 4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.587577 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189d29a31c981042 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:48.789979202 +0000 UTC m=+1.886279145,LastTimestamp:2026-03-16 00:06:48.789979202 +0000 UTC m=+1.886279145,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.594420 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189d29a31d4d45c6 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC 
map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Started,Message:Started container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:48.801854918 +0000 UTC m=+1.898154891,LastTimestamp:2026-03-16 00:06:48.801854918 +0000 UTC m=+1.898154891,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.601687 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d29a31d738a40 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:48.804362816 +0000 UTC m=+1.900662779,LastTimestamp:2026-03-16 00:06:48.804362816 +0000 UTC m=+1.900662779,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.607885 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189d29a31d8a0a34 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] 
[] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:48.805837364 +0000 UTC m=+1.902137317,LastTimestamp:2026-03-16 00:06:48.805837364 +0000 UTC m=+1.902137317,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.614157 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189d29a31deb1266 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:48.812196454 +0000 UTC m=+1.908496417,LastTimestamp:2026-03-16 00:06:48.812196454 +0000 UTC m=+1.908496417,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.621318 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189d29a32ec9a88d 
openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:49.095219341 +0000 UTC m=+2.191519304,LastTimestamp:2026-03-16 00:06:49.095219341 +0000 UTC m=+2.191519304,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.629160 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189d29a32fadc4b3 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:49.110168755 +0000 UTC m=+2.206468748,LastTimestamp:2026-03-16 00:06:49.110168755 +0000 UTC m=+2.206468748,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.635881 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create 
resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189d29a32fc89d41 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:49.111928129 +0000 UTC m=+2.208228122,LastTimestamp:2026-03-16 00:06:49.111928129 +0000 UTC m=+2.208228122,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.642818 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189d29a33dbdbfa1 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Created,Message:Created container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:49.346097057 +0000 UTC m=+2.442397030,LastTimestamp:2026-03-16 00:06:49.346097057 +0000 UTC 
m=+2.442397030,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.649708 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189d29a33e93d854 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Started,Message:Started container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:49.360128084 +0000 UTC m=+2.456428057,LastTimestamp:2026-03-16 00:06:49.360128084 +0000 UTC m=+2.456428057,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.656058 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189d29a33ea5db64 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Pulled,Message:Container image 
\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:49.361308516 +0000 UTC m=+2.457608469,LastTimestamp:2026-03-16 00:06:49.361308516 +0000 UTC m=+2.457608469,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.662589 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189d29a34c7cb99d openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Created,Message:Created container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:49.593493917 +0000 UTC m=+2.689793870,LastTimestamp:2026-03-16 00:06:49.593493917 +0000 UTC m=+2.689793870,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.669800 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189d29a34d7242e7 
openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Started,Message:Started container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:49.609585383 +0000 UTC m=+2.705885336,LastTimestamp:2026-03-16 00:06:49.609585383 +0000 UTC m=+2.705885336,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.677031 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189d29a35249cbd0 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:49.690819536 +0000 UTC m=+2.787119489,LastTimestamp:2026-03-16 00:06:49.690819536 +0000 UTC m=+2.787119489,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.684515 4816 event.go:359] "Server 
rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d29a35261c220 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:49.69238992 +0000 UTC m=+2.788689873,LastTimestamp:2026-03-16 00:06:49.69238992 +0000 UTC m=+2.788689873,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.691828 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189d29a3527da33d openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:49.694217021 +0000 UTC 
m=+2.790516974,LastTimestamp:2026-03-16 00:06:49.694217021 +0000 UTC m=+2.790516974,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.699285 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189d29a352ba8ca5 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:49.698208933 +0000 UTC m=+2.794508926,LastTimestamp:2026-03-16 00:06:49.698208933 +0000 UTC m=+2.794508926,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.705685 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189d29a362973661 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Created,Message:Created container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:49.964328545 +0000 UTC m=+3.060628498,LastTimestamp:2026-03-16 00:06:49.964328545 +0000 UTC m=+3.060628498,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.712169 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d29a362a0900a openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Created,Message:Created container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:49.964941322 +0000 UTC m=+3.061241275,LastTimestamp:2026-03-16 00:06:49.964941322 +0000 UTC m=+3.061241275,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.719469 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189d29a362a1288b openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC 
map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:49.964980363 +0000 UTC m=+3.061280316,LastTimestamp:2026-03-16 00:06:49.964980363 +0000 UTC m=+3.061280316,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.726474 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189d29a362a7d895 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Created,Message:Created container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:49.965418645 +0000 UTC m=+3.061718598,LastTimestamp:2026-03-16 00:06:49.965418645 +0000 UTC m=+3.061718598,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.734435 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" 
event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189d29a3635f93f7 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:49.977459703 +0000 UTC m=+3.073759656,LastTimestamp:2026-03-16 00:06:49.977459703 +0000 UTC m=+3.073759656,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.741051 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189d29a36384f506 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Started,Message:Started container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:49.979909382 +0000 UTC m=+3.076209335,LastTimestamp:2026-03-16 00:06:49.979909382 +0000 UTC m=+3.076209335,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.747248 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot 
create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189d29a363954488 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:49.980978312 +0000 UTC m=+3.077278265,LastTimestamp:2026-03-16 00:06:49.980978312 +0000 UTC m=+3.077278265,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.753823 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189d29a363a73dd5 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Started,Message:Started container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:49.982156245 +0000 UTC m=+3.078456198,LastTimestamp:2026-03-16 00:06:49.982156245 +0000 UTC m=+3.078456198,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.760592 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189d29a363b44e39 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:49.983012409 +0000 UTC m=+3.079312362,LastTimestamp:2026-03-16 00:06:49.983012409 +0000 UTC m=+3.079312362,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.768608 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d29a36437e593 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Started,Message:Started container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 
00:06:49.991636371 +0000 UTC m=+3.087936324,LastTimestamp:2026-03-16 00:06:49.991636371 +0000 UTC m=+3.087936324,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.780644 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189d29a370370dd6 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Created,Message:Created container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:50.192907734 +0000 UTC m=+3.289207687,LastTimestamp:2026-03-16 00:06:50.192907734 +0000 UTC m=+3.289207687,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.783022 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189d29a370522c7e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Created,Message:Created container 
kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:50.194685054 +0000 UTC m=+3.290985007,LastTimestamp:2026-03-16 00:06:50.194685054 +0000 UTC m=+3.290985007,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.789305 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189d29a370ec8c00 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Started,Message:Started container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:50.204802048 +0000 UTC m=+3.301102011,LastTimestamp:2026-03-16 00:06:50.204802048 +0000 UTC m=+3.301102011,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.796043 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189d29a371011dbc openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:50.206150076 +0000 UTC m=+3.302450029,LastTimestamp:2026-03-16 00:06:50.206150076 +0000 UTC m=+3.302450029,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.802958 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189d29a37120d9b6 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Started,Message:Started container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:50.208229814 +0000 UTC m=+3.304529767,LastTimestamp:2026-03-16 00:06:50.208229814 +0000 UTC m=+3.304529767,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.808772 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create 
resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189d29a371838c6f openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:50.214698095 +0000 UTC m=+3.310998048,LastTimestamp:2026-03-16 00:06:50.214698095 +0000 UTC m=+3.310998048,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.814540 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189d29a37ce5acb4 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Created,Message:Created container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:50.40567826 +0000 UTC m=+3.501978233,LastTimestamp:2026-03-16 00:06:50.40567826 +0000 UTC m=+3.501978233,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.818977 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189d29a37d0a9f86 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Created,Message:Created container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:50.408099718 +0000 UTC m=+3.504399681,LastTimestamp:2026-03-16 00:06:50.408099718 +0000 UTC m=+3.504399681,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.824934 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189d29a37df1e6e3 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Started,Message:Started container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:50.423256803 
+0000 UTC m=+3.519556786,LastTimestamp:2026-03-16 00:06:50.423256803 +0000 UTC m=+3.519556786,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.829687 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189d29a37e13dc60 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:50.425482336 +0000 UTC m=+3.521782289,LastTimestamp:2026-03-16 00:06:50.425482336 +0000 UTC m=+3.521782289,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.834083 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189d29a37e28af62 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Started,Message:Started container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:50.426847074 +0000 UTC m=+3.523147027,LastTimestamp:2026-03-16 00:06:50.426847074 +0000 UTC m=+3.523147027,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.840236 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189d29a38af0d303 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Created,Message:Created container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:50.641289987 +0000 UTC m=+3.737589940,LastTimestamp:2026-03-16 00:06:50.641289987 +0000 UTC m=+3.737589940,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.846211 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" 
event="&Event{ObjectMeta:{kube-apiserver-crc.189d29a38bb8a484 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Started,Message:Started container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:50.654385284 +0000 UTC m=+3.750685257,LastTimestamp:2026-03-16 00:06:50.654385284 +0000 UTC m=+3.750685257,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.852536 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189d29a38bcb4692 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:50.655606418 +0000 UTC m=+3.751906371,LastTimestamp:2026-03-16 00:06:50.655606418 +0000 UTC m=+3.751906371,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 
00:07:32.860101 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d29a38f6245ed openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:50.715833837 +0000 UTC m=+3.812133780,LastTimestamp:2026-03-16 00:06:50.715833837 +0000 UTC m=+3.812133780,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.867280 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189d29a399d30cfd openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:50.890996989 +0000 UTC m=+3.987296952,LastTimestamp:2026-03-16 00:06:50.890996989 +0000 UTC 
m=+3.987296952,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.871416 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189d29a39aaa9abb openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:50.905123515 +0000 UTC m=+4.001423468,LastTimestamp:2026-03-16 00:06:50.905123515 +0000 UTC m=+4.001423468,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.877877 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d29a39e533c0a openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Created,Message:Created container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:50.966506506 +0000 UTC m=+4.062806469,LastTimestamp:2026-03-16 
00:06:50.966506506 +0000 UTC m=+4.062806469,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.883114 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d29a39f08dcbc openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Started,Message:Started container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:50.97840966 +0000 UTC m=+4.074709653,LastTimestamp:2026-03-16 00:06:50.97840966 +0000 UTC m=+4.074709653,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.885354 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d29a3cca663f4 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 
00:06:51.743708148 +0000 UTC m=+4.840008101,LastTimestamp:2026-03-16 00:06:51.743708148 +0000 UTC m=+4.840008101,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.890186 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d29a3d9f88e2b openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Created,Message:Created container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:51.967196715 +0000 UTC m=+5.063496708,LastTimestamp:2026-03-16 00:06:51.967196715 +0000 UTC m=+5.063496708,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.895020 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d29a3da7fe8d3 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Started,Message:Started container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:51.976067283 +0000 UTC m=+5.072367266,LastTimestamp:2026-03-16 00:06:51.976067283 +0000 UTC 
m=+5.072367266,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.899395 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d29a3da9a9696 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:51.977815702 +0000 UTC m=+5.074115655,LastTimestamp:2026-03-16 00:06:51.977815702 +0000 UTC m=+5.074115655,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.905334 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d29a3e5e12583 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Created,Message:Created container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:52.166989187 +0000 UTC 
m=+5.263289180,LastTimestamp:2026-03-16 00:06:52.166989187 +0000 UTC m=+5.263289180,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.912478 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d29a3e6ff6a07 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Started,Message:Started container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:52.185750023 +0000 UTC m=+5.282049986,LastTimestamp:2026-03-16 00:06:52.185750023 +0000 UTC m=+5.282049986,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.919003 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d29a3e722a22d openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on 
machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:52.188058157 +0000 UTC m=+5.284358150,LastTimestamp:2026-03-16 00:06:52.188058157 +0000 UTC m=+5.284358150,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.923370 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d29a3f5a42894 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Created,Message:Created container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:52.431427732 +0000 UTC m=+5.527727725,LastTimestamp:2026-03-16 00:06:52.431427732 +0000 UTC m=+5.527727725,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.927360 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d29a3f6dccea4 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Started,Message:Started container 
etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:52.451917476 +0000 UTC m=+5.548217469,LastTimestamp:2026-03-16 00:06:52.451917476 +0000 UTC m=+5.548217469,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.932490 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d29a3f6f563d3 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:52.453528531 +0000 UTC m=+5.549828524,LastTimestamp:2026-03-16 00:06:52.453528531 +0000 UTC m=+5.549828524,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.936807 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d29a405d8f234 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Created,Message:Created container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:52.703322676 +0000 UTC m=+5.799622629,LastTimestamp:2026-03-16 00:06:52.703322676 +0000 UTC m=+5.799622629,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.942997 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d29a406995351 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Started,Message:Started container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:52.715930449 +0000 UTC m=+5.812230402,LastTimestamp:2026-03-16 00:06:52.715930449 +0000 UTC m=+5.812230402,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.950590 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d29a406aa8885 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:52.717058181 +0000 UTC m=+5.813358174,LastTimestamp:2026-03-16 00:06:52.717058181 +0000 UTC m=+5.813358174,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.957992 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d29a415166091 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Created,Message:Created container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:52.959006865 +0000 UTC m=+6.055306828,LastTimestamp:2026-03-16 00:06:52.959006865 +0000 UTC m=+6.055306828,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.962159 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d29a416246ddf openshift-etcd 0 
0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Started,Message:Started container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:52.976704991 +0000 UTC m=+6.073004954,LastTimestamp:2026-03-16 00:06:52.976704991 +0000 UTC m=+6.073004954,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.969806 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 16 00:07:32 crc kubenswrapper[4816]: &Event{ObjectMeta:{kube-apiserver-crc.189d29a605d4ae68 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:ProbeError,Message:Readiness probe error: Get "https://192.168.126.11:17697/healthz": dial tcp 192.168.126.11:17697: connect: connection refused Mar 16 00:07:32 crc kubenswrapper[4816]: body: Mar 16 00:07:32 crc kubenswrapper[4816]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:07:01.292977768 +0000 UTC m=+14.389277751,LastTimestamp:2026-03-16 00:07:01.292977768 +0000 UTC m=+14.389277751,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 16 00:07:32 crc kubenswrapper[4816]: > Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.976199 4816 
event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189d29a605d6e47a openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Unhealthy,Message:Readiness probe failed: Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:07:01.293122682 +0000 UTC m=+14.389422645,LastTimestamp:2026-03-16 00:07:01.293122682 +0000 UTC m=+14.389422645,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.982398 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 16 00:07:32 crc kubenswrapper[4816]: &Event{ObjectMeta:{kube-controller-manager-crc.189d29a62855d360 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 16 00:07:32 
crc kubenswrapper[4816]: body: Mar 16 00:07:32 crc kubenswrapper[4816]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:07:01.87186672 +0000 UTC m=+14.968166673,LastTimestamp:2026-03-16 00:07:01.87186672 +0000 UTC m=+14.968166673,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 16 00:07:32 crc kubenswrapper[4816]: > Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.989500 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189d29a62856772b openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:07:01.871908651 +0000 UTC m=+14.968208604,LastTimestamp:2026-03-16 00:07:01.871908651 +0000 UTC m=+14.968208604,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:32 crc kubenswrapper[4816]: E0316 00:07:32.994796 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 16 00:07:32 crc kubenswrapper[4816]: 
&Event{ObjectMeta:{kube-apiserver-crc.189d29a6426f7023 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 16 00:07:32 crc kubenswrapper[4816]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 16 00:07:32 crc kubenswrapper[4816]: Mar 16 00:07:32 crc kubenswrapper[4816]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:07:02.309752867 +0000 UTC m=+15.406052820,LastTimestamp:2026-03-16 00:07:02.309752867 +0000 UTC m=+15.406052820,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 16 00:07:32 crc kubenswrapper[4816]: > Mar 16 00:07:33 crc kubenswrapper[4816]: E0316 00:07:33.000699 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189d29a6427062f5 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:07:02.309815029 +0000 UTC m=+15.406114982,LastTimestamp:2026-03-16 
00:07:02.309815029 +0000 UTC m=+15.406114982,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:33 crc kubenswrapper[4816]: E0316 00:07:33.006386 4816 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189d29a6426f7023\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 16 00:07:33 crc kubenswrapper[4816]: &Event{ObjectMeta:{kube-apiserver-crc.189d29a6426f7023 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 16 00:07:33 crc kubenswrapper[4816]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 16 00:07:33 crc kubenswrapper[4816]: Mar 16 00:07:33 crc kubenswrapper[4816]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:07:02.309752867 +0000 UTC m=+15.406052820,LastTimestamp:2026-03-16 00:07:02.317635668 +0000 UTC m=+15.413935651,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 16 00:07:33 crc kubenswrapper[4816]: > Mar 16 00:07:33 crc kubenswrapper[4816]: E0316 00:07:33.011876 4816 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189d29a6427062f5\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" 
event="&Event{ObjectMeta:{kube-apiserver-crc.189d29a6427062f5 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:07:02.309815029 +0000 UTC m=+15.406114982,LastTimestamp:2026-03-16 00:07:02.317831834 +0000 UTC m=+15.414131827,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:33 crc kubenswrapper[4816]: E0316 00:07:33.016605 4816 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189d29a38bcb4692\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189d29a38bcb4692 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:50.655606418 +0000 UTC m=+3.751906371,LastTimestamp:2026-03-16 00:07:02.792238026 +0000 UTC m=+15.888538019,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 
00:07:33 crc kubenswrapper[4816]: E0316 00:07:33.025196 4816 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189d29a62855d360\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 16 00:07:33 crc kubenswrapper[4816]: &Event{ObjectMeta:{kube-controller-manager-crc.189d29a62855d360 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 16 00:07:33 crc kubenswrapper[4816]: body: Mar 16 00:07:33 crc kubenswrapper[4816]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:07:01.87186672 +0000 UTC m=+14.968166673,LastTimestamp:2026-03-16 00:07:11.872281899 +0000 UTC m=+24.968581872,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 16 00:07:33 crc kubenswrapper[4816]: > Mar 16 00:07:33 crc kubenswrapper[4816]: E0316 00:07:33.031025 4816 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189d29a62856772b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189d29a62856772b openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:07:01.871908651 +0000 UTC m=+14.968208604,LastTimestamp:2026-03-16 00:07:11.872352241 +0000 UTC m=+24.968652214,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:33 crc kubenswrapper[4816]: E0316 00:07:33.036333 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 16 00:07:33 crc kubenswrapper[4816]: &Event{ObjectMeta:{kube-controller-manager-crc.189d29aa69480dd9 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": read tcp 192.168.126.11:44098->192.168.126.11:10357: read: connection reset by peer Mar 16 00:07:33 crc kubenswrapper[4816]: body: Mar 16 00:07:33 crc kubenswrapper[4816]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:07:20.141352409 +0000 UTC m=+33.237652392,LastTimestamp:2026-03-16 00:07:20.141352409 +0000 UTC m=+33.237652392,Count:1,Type:Warning,EventTime:0001-01-01 
00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 16 00:07:33 crc kubenswrapper[4816]: > Mar 16 00:07:33 crc kubenswrapper[4816]: E0316 00:07:33.043521 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189d29aa69492b5d openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": read tcp 192.168.126.11:44098->192.168.126.11:10357: read: connection reset by peer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:07:20.141425501 +0000 UTC m=+33.237725484,LastTimestamp:2026-03-16 00:07:20.141425501 +0000 UTC m=+33.237725484,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:33 crc kubenswrapper[4816]: E0316 00:07:33.048458 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189d29aa6988ceeb openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Killing,Message:Container cluster-policy-controller failed startup probe, will be restarted,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:07:20.145596139 +0000 UTC m=+33.241896132,LastTimestamp:2026-03-16 00:07:20.145596139 +0000 UTC m=+33.241896132,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:33 crc kubenswrapper[4816]: E0316 00:07:33.053429 4816 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189d29a31c981042\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189d29a31c981042 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:48.789979202 +0000 UTC m=+1.886279145,LastTimestamp:2026-03-16 00:07:20.163698708 +0000 UTC m=+33.259998701,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:33 crc kubenswrapper[4816]: E0316 00:07:33.058643 4816 
event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189d29a32ec9a88d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189d29a32ec9a88d openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:49.095219341 +0000 UTC m=+2.191519304,LastTimestamp:2026-03-16 00:07:20.406030355 +0000 UTC m=+33.502330348,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:33 crc kubenswrapper[4816]: E0316 00:07:33.063717 4816 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189d29a32fadc4b3\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189d29a32fadc4b3 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:49.110168755 +0000 UTC 
m=+2.206468748,LastTimestamp:2026-03-16 00:07:20.41973216 +0000 UTC m=+33.516032123,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:33 crc kubenswrapper[4816]: E0316 00:07:33.074087 4816 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189d29a62855d360\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 16 00:07:33 crc kubenswrapper[4816]: &Event{ObjectMeta:{kube-controller-manager-crc.189d29a62855d360 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 16 00:07:33 crc kubenswrapper[4816]: body: Mar 16 00:07:33 crc kubenswrapper[4816]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:07:01.87186672 +0000 UTC m=+14.968166673,LastTimestamp:2026-03-16 00:07:31.873662959 +0000 UTC m=+44.969962952,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 16 00:07:33 crc kubenswrapper[4816]: > Mar 16 00:07:33 crc kubenswrapper[4816]: E0316 00:07:33.080503 4816 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189d29a62856772b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" 
event="&Event{ObjectMeta:{kube-controller-manager-crc.189d29a62856772b openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:07:01.871908651 +0000 UTC m=+14.968208604,LastTimestamp:2026-03-16 00:07:31.873729691 +0000 UTC m=+44.970029674,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:07:33 crc kubenswrapper[4816]: I0316 00:07:33.585790 4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 16 00:07:34 crc kubenswrapper[4816]: I0316 00:07:34.584992 4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 16 00:07:35 crc kubenswrapper[4816]: I0316 00:07:35.583527 4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 16 00:07:35 crc kubenswrapper[4816]: W0316 00:07:35.589333 4816 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list 
*v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope Mar 16 00:07:35 crc kubenswrapper[4816]: E0316 00:07:35.589409 4816 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 16 00:07:36 crc kubenswrapper[4816]: I0316 00:07:36.586759 4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 16 00:07:36 crc kubenswrapper[4816]: E0316 00:07:36.750904 4816 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 16 00:07:36 crc kubenswrapper[4816]: I0316 00:07:36.757745 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:07:36 crc kubenswrapper[4816]: I0316 00:07:36.759152 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:36 crc kubenswrapper[4816]: I0316 00:07:36.759215 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:36 crc kubenswrapper[4816]: I0316 00:07:36.759235 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:36 crc kubenswrapper[4816]: I0316 00:07:36.759276 4816 kubelet_node_status.go:76] "Attempting to 
register node" node="crc" Mar 16 00:07:36 crc kubenswrapper[4816]: E0316 00:07:36.764040 4816 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 16 00:07:36 crc kubenswrapper[4816]: W0316 00:07:36.888835 4816 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope Mar 16 00:07:36 crc kubenswrapper[4816]: E0316 00:07:36.888901 4816 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 16 00:07:37 crc kubenswrapper[4816]: W0316 00:07:37.333151 4816 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "crc" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope Mar 16 00:07:37 crc kubenswrapper[4816]: E0316 00:07:37.333704 4816 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"crc\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 16 00:07:37 crc kubenswrapper[4816]: I0316 00:07:37.581775 4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 16 00:07:37 crc kubenswrapper[4816]: E0316 00:07:37.770171 4816 
eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 16 00:07:38 crc kubenswrapper[4816]: I0316 00:07:38.473051 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 16 00:07:38 crc kubenswrapper[4816]: I0316 00:07:38.473305 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:07:38 crc kubenswrapper[4816]: I0316 00:07:38.475339 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:38 crc kubenswrapper[4816]: I0316 00:07:38.475412 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:38 crc kubenswrapper[4816]: I0316 00:07:38.475433 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:38 crc kubenswrapper[4816]: I0316 00:07:38.587900 4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 16 00:07:38 crc kubenswrapper[4816]: I0316 00:07:38.875508 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 16 00:07:38 crc kubenswrapper[4816]: I0316 00:07:38.875671 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:07:38 crc kubenswrapper[4816]: I0316 00:07:38.876713 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:38 crc kubenswrapper[4816]: I0316 00:07:38.876756 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 16 00:07:38 crc kubenswrapper[4816]: I0316 00:07:38.876770 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:38 crc kubenswrapper[4816]: I0316 00:07:38.880743 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 16 00:07:38 crc kubenswrapper[4816]: I0316 00:07:38.917910 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:07:38 crc kubenswrapper[4816]: I0316 00:07:38.918663 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:38 crc kubenswrapper[4816]: I0316 00:07:38.918699 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:38 crc kubenswrapper[4816]: I0316 00:07:38.918709 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:39 crc kubenswrapper[4816]: I0316 00:07:39.583510 4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 16 00:07:40 crc kubenswrapper[4816]: I0316 00:07:40.583279 4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 16 00:07:41 crc kubenswrapper[4816]: I0316 00:07:41.584922 4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 
16 00:07:42 crc kubenswrapper[4816]: I0316 00:07:42.583096 4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 16 00:07:43 crc kubenswrapper[4816]: I0316 00:07:43.586682 4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 16 00:07:43 crc kubenswrapper[4816]: E0316 00:07:43.759322 4816 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 16 00:07:43 crc kubenswrapper[4816]: I0316 00:07:43.764461 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:07:43 crc kubenswrapper[4816]: I0316 00:07:43.766756 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:43 crc kubenswrapper[4816]: I0316 00:07:43.766852 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:43 crc kubenswrapper[4816]: I0316 00:07:43.766886 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:43 crc kubenswrapper[4816]: I0316 00:07:43.766947 4816 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 16 00:07:43 crc kubenswrapper[4816]: E0316 00:07:43.773253 4816 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the 
cluster scope" node="crc" Mar 16 00:07:44 crc kubenswrapper[4816]: I0316 00:07:44.584854 4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 16 00:07:45 crc kubenswrapper[4816]: I0316 00:07:45.586015 4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 16 00:07:45 crc kubenswrapper[4816]: I0316 00:07:45.667882 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:07:45 crc kubenswrapper[4816]: I0316 00:07:45.669427 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:45 crc kubenswrapper[4816]: I0316 00:07:45.669490 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:45 crc kubenswrapper[4816]: I0316 00:07:45.669517 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:45 crc kubenswrapper[4816]: I0316 00:07:45.670533 4816 scope.go:117] "RemoveContainer" containerID="97295c99d30410e470f248a46f06606331693794a78b950d14f87ec94dc3c6d8" Mar 16 00:07:45 crc kubenswrapper[4816]: E0316 00:07:45.670933 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 16 00:07:46 crc kubenswrapper[4816]: 
I0316 00:07:46.586716 4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 16 00:07:47 crc kubenswrapper[4816]: I0316 00:07:47.585615 4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 16 00:07:47 crc kubenswrapper[4816]: E0316 00:07:47.770994 4816 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 16 00:07:48 crc kubenswrapper[4816]: I0316 00:07:48.585629 4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 16 00:07:49 crc kubenswrapper[4816]: I0316 00:07:49.585635 4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 16 00:07:50 crc kubenswrapper[4816]: I0316 00:07:50.586239 4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 16 00:07:50 crc kubenswrapper[4816]: E0316 00:07:50.765020 4816 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" 
Mar 16 00:07:50 crc kubenswrapper[4816]: I0316 00:07:50.773955 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:07:50 crc kubenswrapper[4816]: I0316 00:07:50.775392 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:50 crc kubenswrapper[4816]: I0316 00:07:50.775477 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:50 crc kubenswrapper[4816]: I0316 00:07:50.775505 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:50 crc kubenswrapper[4816]: I0316 00:07:50.775589 4816 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 16 00:07:50 crc kubenswrapper[4816]: E0316 00:07:50.782424 4816 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 16 00:07:51 crc kubenswrapper[4816]: I0316 00:07:51.584926 4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 16 00:07:52 crc kubenswrapper[4816]: I0316 00:07:52.324610 4816 csr.go:261] certificate signing request csr-zsmc7 is approved, waiting to be issued Mar 16 00:07:52 crc kubenswrapper[4816]: I0316 00:07:52.335672 4816 csr.go:257] certificate signing request csr-zsmc7 is issued Mar 16 00:07:52 crc kubenswrapper[4816]: I0316 00:07:52.412952 4816 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Mar 16 00:07:52 crc kubenswrapper[4816]: I0316 00:07:52.443857 4816 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Mar 16 
00:07:53 crc kubenswrapper[4816]: I0316 00:07:53.337429 4816 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2027-01-09 07:44:42.497379377 +0000 UTC Mar 16 00:07:53 crc kubenswrapper[4816]: I0316 00:07:53.337511 4816 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 7183h36m49.159874094s for next certificate rotation Mar 16 00:07:57 crc kubenswrapper[4816]: E0316 00:07:57.771136 4816 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 16 00:07:57 crc kubenswrapper[4816]: I0316 00:07:57.782937 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:07:57 crc kubenswrapper[4816]: I0316 00:07:57.784472 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:57 crc kubenswrapper[4816]: I0316 00:07:57.784525 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:57 crc kubenswrapper[4816]: I0316 00:07:57.784543 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:57 crc kubenswrapper[4816]: I0316 00:07:57.784723 4816 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 16 00:07:57 crc kubenswrapper[4816]: I0316 00:07:57.793732 4816 kubelet_node_status.go:115] "Node was previously registered" node="crc" Mar 16 00:07:57 crc kubenswrapper[4816]: I0316 00:07:57.794034 4816 kubelet_node_status.go:79] "Successfully registered node" node="crc" Mar 16 00:07:57 crc kubenswrapper[4816]: E0316 00:07:57.794067 4816 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 16 00:07:57 crc kubenswrapper[4816]: I0316 00:07:57.798864 
4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:57 crc kubenswrapper[4816]: I0316 00:07:57.799057 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:57 crc kubenswrapper[4816]: I0316 00:07:57.799192 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:57 crc kubenswrapper[4816]: I0316 00:07:57.799328 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:07:57 crc kubenswrapper[4816]: I0316 00:07:57.799457 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:57Z","lastTransitionTime":"2026-03-16T00:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:07:57 crc kubenswrapper[4816]: E0316 00:07:57.819692 4816 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:07:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:07:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:07:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:07:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d8fb348c-4907-4c8f-859a-735976530e03\\\",\\\"systemUUID\\\":\\\"e97abf98-298f-4589-a70a-4cfb5cb2994a\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 16 00:07:57 crc kubenswrapper[4816]: I0316 00:07:57.832119 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:57 crc kubenswrapper[4816]: I0316 00:07:57.832171 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:57 crc kubenswrapper[4816]: I0316 00:07:57.832189 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:57 crc kubenswrapper[4816]: I0316 00:07:57.832216 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:07:57 crc kubenswrapper[4816]: I0316 00:07:57.832235 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:57Z","lastTransitionTime":"2026-03-16T00:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:07:57 crc kubenswrapper[4816]: E0316 00:07:57.850322 4816 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:07:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:07:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:07:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:07:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d8fb348c-4907-4c8f-859a-735976530e03\\\",\\\"systemUUID\\\":\\\"e97abf98-298f-4589-a70a-4cfb5cb2994a\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 16 00:07:57 crc kubenswrapper[4816]: I0316 00:07:57.863523 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:57 crc kubenswrapper[4816]: I0316 00:07:57.863622 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:57 crc kubenswrapper[4816]: I0316 00:07:57.863644 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:57 crc kubenswrapper[4816]: I0316 00:07:57.863674 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:07:57 crc kubenswrapper[4816]: I0316 00:07:57.863696 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:57Z","lastTransitionTime":"2026-03-16T00:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:07:57 crc kubenswrapper[4816]: E0316 00:07:57.880161 4816 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:07:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:07:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:07:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:07:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d8fb348c-4907-4c8f-859a-735976530e03\\\",\\\"systemUUID\\\":\\\"e97abf98-298f-4589-a70a-4cfb5cb2994a\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 16 00:07:57 crc kubenswrapper[4816]: I0316 00:07:57.891219 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:57 crc kubenswrapper[4816]: I0316 00:07:57.891305 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:57 crc kubenswrapper[4816]: I0316 00:07:57.891326 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:57 crc kubenswrapper[4816]: I0316 00:07:57.891357 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:07:57 crc kubenswrapper[4816]: I0316 00:07:57.891377 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:57Z","lastTransitionTime":"2026-03-16T00:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:07:57 crc kubenswrapper[4816]: E0316 00:07:57.908071 4816 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:07:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:07:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:07:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:07:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d8fb348c-4907-4c8f-859a-735976530e03\\\",\\\"systemUUID\\\":\\\"e97abf98-298f-4589-a70a-4cfb5cb2994a\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 16 00:07:57 crc kubenswrapper[4816]: E0316 00:07:57.908196 4816 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 16 00:07:57 crc kubenswrapper[4816]: E0316 00:07:57.908231 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:58 crc kubenswrapper[4816]: E0316 00:07:58.009115 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:58 crc kubenswrapper[4816]: E0316 00:07:58.109704 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:58 crc kubenswrapper[4816]: E0316 00:07:58.210184 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:58 crc kubenswrapper[4816]: E0316 00:07:58.310917 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:58 crc kubenswrapper[4816]: E0316 00:07:58.411438 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:58 crc kubenswrapper[4816]: E0316 00:07:58.512509 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:58 crc kubenswrapper[4816]: E0316 
00:07:58.613162 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:58 crc kubenswrapper[4816]: I0316 00:07:58.667397 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:07:58 crc kubenswrapper[4816]: I0316 00:07:58.669333 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:58 crc kubenswrapper[4816]: I0316 00:07:58.669425 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:58 crc kubenswrapper[4816]: I0316 00:07:58.669447 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:58 crc kubenswrapper[4816]: I0316 00:07:58.670593 4816 scope.go:117] "RemoveContainer" containerID="97295c99d30410e470f248a46f06606331693794a78b950d14f87ec94dc3c6d8" Mar 16 00:07:58 crc kubenswrapper[4816]: E0316 00:07:58.713289 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:58 crc kubenswrapper[4816]: E0316 00:07:58.813938 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:58 crc kubenswrapper[4816]: E0316 00:07:58.914488 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:58 crc kubenswrapper[4816]: I0316 00:07:58.982302 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 16 00:07:58 crc kubenswrapper[4816]: I0316 00:07:58.984954 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f"} Mar 16 00:07:58 crc kubenswrapper[4816]: I0316 00:07:58.985212 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:07:58 crc kubenswrapper[4816]: I0316 00:07:58.986944 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:58 crc kubenswrapper[4816]: I0316 00:07:58.986987 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:58 crc kubenswrapper[4816]: I0316 00:07:58.987007 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:59 crc kubenswrapper[4816]: E0316 00:07:59.015089 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:59 crc kubenswrapper[4816]: E0316 00:07:59.115953 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:59 crc kubenswrapper[4816]: E0316 00:07:59.216943 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:59 crc kubenswrapper[4816]: E0316 00:07:59.317892 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:59 crc kubenswrapper[4816]: E0316 00:07:59.418900 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:59 crc kubenswrapper[4816]: E0316 00:07:59.519913 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:59 crc kubenswrapper[4816]: E0316 00:07:59.620695 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node 
\"crc\" not found" Mar 16 00:07:59 crc kubenswrapper[4816]: E0316 00:07:59.721797 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:59 crc kubenswrapper[4816]: E0316 00:07:59.822492 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:59 crc kubenswrapper[4816]: E0316 00:07:59.923677 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:59 crc kubenswrapper[4816]: I0316 00:07:59.990200 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 16 00:07:59 crc kubenswrapper[4816]: I0316 00:07:59.991373 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 16 00:07:59 crc kubenswrapper[4816]: I0316 00:07:59.994740 4816 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f" exitCode=255 Mar 16 00:07:59 crc kubenswrapper[4816]: I0316 00:07:59.994805 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f"} Mar 16 00:07:59 crc kubenswrapper[4816]: I0316 00:07:59.994878 4816 scope.go:117] "RemoveContainer" containerID="97295c99d30410e470f248a46f06606331693794a78b950d14f87ec94dc3c6d8" Mar 16 00:07:59 crc kubenswrapper[4816]: I0316 00:07:59.995063 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:07:59 crc kubenswrapper[4816]: I0316 00:07:59.996250 4816 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:59 crc kubenswrapper[4816]: I0316 00:07:59.996301 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:59 crc kubenswrapper[4816]: I0316 00:07:59.996321 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:59 crc kubenswrapper[4816]: I0316 00:07:59.997261 4816 scope.go:117] "RemoveContainer" containerID="29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f" Mar 16 00:07:59 crc kubenswrapper[4816]: E0316 00:07:59.997540 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 16 00:08:00 crc kubenswrapper[4816]: E0316 00:08:00.024279 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:08:00 crc kubenswrapper[4816]: E0316 00:08:00.124761 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:08:00 crc kubenswrapper[4816]: E0316 00:08:00.225201 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:08:00 crc kubenswrapper[4816]: E0316 00:08:00.326007 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:08:00 crc kubenswrapper[4816]: E0316 00:08:00.426410 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:08:00 crc kubenswrapper[4816]: E0316 
00:08:00.526627 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:08:00 crc kubenswrapper[4816]: E0316 00:08:00.626828 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:08:00 crc kubenswrapper[4816]: E0316 00:08:00.726957 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:08:00 crc kubenswrapper[4816]: E0316 00:08:00.827789 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:08:00 crc kubenswrapper[4816]: E0316 00:08:00.928922 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:08:01 crc kubenswrapper[4816]: I0316 00:08:00.999951 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 16 00:08:01 crc kubenswrapper[4816]: E0316 00:08:01.029793 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:08:01 crc kubenswrapper[4816]: E0316 00:08:01.130827 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:08:01 crc kubenswrapper[4816]: E0316 00:08:01.231689 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:08:01 crc kubenswrapper[4816]: I0316 00:08:01.292367 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 16 00:08:01 crc kubenswrapper[4816]: I0316 00:08:01.292591 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:08:01 crc kubenswrapper[4816]: I0316 00:08:01.294187 4816 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:01 crc kubenswrapper[4816]: I0316 00:08:01.294278 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:01 crc kubenswrapper[4816]: I0316 00:08:01.294352 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:01 crc kubenswrapper[4816]: I0316 00:08:01.295740 4816 scope.go:117] "RemoveContainer" containerID="29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f" Mar 16 00:08:01 crc kubenswrapper[4816]: E0316 00:08:01.296107 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 16 00:08:01 crc kubenswrapper[4816]: E0316 00:08:01.332889 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:08:01 crc kubenswrapper[4816]: E0316 00:08:01.433881 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:08:01 crc kubenswrapper[4816]: E0316 00:08:01.534621 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:08:01 crc kubenswrapper[4816]: E0316 00:08:01.635038 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:08:01 crc kubenswrapper[4816]: E0316 00:08:01.735248 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:08:01 crc kubenswrapper[4816]: E0316 00:08:01.836357 4816 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:08:01 crc kubenswrapper[4816]: E0316 00:08:01.936855 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:08:02 crc kubenswrapper[4816]: E0316 00:08:02.037674 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:08:02 crc kubenswrapper[4816]: E0316 00:08:02.138811 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:08:02 crc kubenswrapper[4816]: E0316 00:08:02.239276 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:08:02 crc kubenswrapper[4816]: E0316 00:08:02.340124 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:08:02 crc kubenswrapper[4816]: E0316 00:08:02.440892 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:08:02 crc kubenswrapper[4816]: E0316 00:08:02.541661 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:08:02 crc kubenswrapper[4816]: E0316 00:08:02.642660 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:08:02 crc kubenswrapper[4816]: E0316 00:08:02.743974 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:08:02 crc kubenswrapper[4816]: E0316 00:08:02.844665 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:08:02 crc kubenswrapper[4816]: E0316 00:08:02.945153 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:08:03 crc 
kubenswrapper[4816]: E0316 00:08:03.046225 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:08:03 crc kubenswrapper[4816]: E0316 00:08:03.146789 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:08:03 crc kubenswrapper[4816]: E0316 00:08:03.247308 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:08:03 crc kubenswrapper[4816]: E0316 00:08:03.348181 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:08:03 crc kubenswrapper[4816]: E0316 00:08:03.449161 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:08:03 crc kubenswrapper[4816]: E0316 00:08:03.550332 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:08:03 crc kubenswrapper[4816]: E0316 00:08:03.651269 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:08:03 crc kubenswrapper[4816]: E0316 00:08:03.752038 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:08:03 crc kubenswrapper[4816]: E0316 00:08:03.853523 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:08:03 crc kubenswrapper[4816]: E0316 00:08:03.954126 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:08:04 crc kubenswrapper[4816]: E0316 00:08:04.054836 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:08:04 crc kubenswrapper[4816]: E0316 00:08:04.155700 4816 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"crc\" not found" Mar 16 00:08:04 crc kubenswrapper[4816]: E0316 00:08:04.256260 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:08:04 crc kubenswrapper[4816]: E0316 00:08:04.357526 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:08:04 crc kubenswrapper[4816]: E0316 00:08:04.458033 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:08:04 crc kubenswrapper[4816]: E0316 00:08:04.559439 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:08:04 crc kubenswrapper[4816]: E0316 00:08:04.659993 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:08:04 crc kubenswrapper[4816]: I0316 00:08:04.666842 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:08:04 crc kubenswrapper[4816]: I0316 00:08:04.668586 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:04 crc kubenswrapper[4816]: I0316 00:08:04.668835 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:04 crc kubenswrapper[4816]: I0316 00:08:04.668983 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:04 crc kubenswrapper[4816]: E0316 00:08:04.760649 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:08:04 crc kubenswrapper[4816]: E0316 00:08:04.860828 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:08:04 crc kubenswrapper[4816]: E0316 00:08:04.961492 4816 kubelet_node_status.go:503] 
"Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:08:05 crc kubenswrapper[4816]: E0316 00:08:05.062296 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:08:05 crc kubenswrapper[4816]: E0316 00:08:05.163078 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:08:05 crc kubenswrapper[4816]: E0316 00:08:05.263701 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:08:05 crc kubenswrapper[4816]: E0316 00:08:05.363858 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:08:05 crc kubenswrapper[4816]: E0316 00:08:05.464010 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:08:05 crc kubenswrapper[4816]: E0316 00:08:05.564523 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:08:05 crc kubenswrapper[4816]: E0316 00:08:05.665573 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:08:05 crc kubenswrapper[4816]: E0316 00:08:05.766201 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:08:05 crc kubenswrapper[4816]: E0316 00:08:05.866683 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:08:05 crc kubenswrapper[4816]: E0316 00:08:05.967834 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:08:06 crc kubenswrapper[4816]: E0316 00:08:06.068600 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:08:06 crc kubenswrapper[4816]: E0316 
00:08:06.169082 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:08:06 crc kubenswrapper[4816]: E0316 00:08:06.269640 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:08:06 crc kubenswrapper[4816]: E0316 00:08:06.370630 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:08:06 crc kubenswrapper[4816]: E0316 00:08:06.471647 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:08:06 crc kubenswrapper[4816]: E0316 00:08:06.572936 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:08:06 crc kubenswrapper[4816]: E0316 00:08:06.673623 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:08:06 crc kubenswrapper[4816]: E0316 00:08:06.774613 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:08:06 crc kubenswrapper[4816]: I0316 00:08:06.808116 4816 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 16 00:08:06 crc kubenswrapper[4816]: I0316 00:08:06.808377 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:08:06 crc kubenswrapper[4816]: I0316 00:08:06.809999 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:06 crc kubenswrapper[4816]: I0316 00:08:06.810073 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:06 crc kubenswrapper[4816]: I0316 00:08:06.810102 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Mar 16 00:08:06 crc kubenswrapper[4816]: I0316 00:08:06.811067 4816 scope.go:117] "RemoveContainer" containerID="29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f" Mar 16 00:08:06 crc kubenswrapper[4816]: E0316 00:08:06.811348 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 16 00:08:06 crc kubenswrapper[4816]: E0316 00:08:06.875144 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:08:06 crc kubenswrapper[4816]: E0316 00:08:06.975868 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:08:07 crc kubenswrapper[4816]: E0316 00:08:07.076360 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:08:07 crc kubenswrapper[4816]: E0316 00:08:07.176845 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:08:07 crc kubenswrapper[4816]: E0316 00:08:07.277898 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:08:07 crc kubenswrapper[4816]: E0316 00:08:07.378659 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:08:07 crc kubenswrapper[4816]: E0316 00:08:07.479847 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:08:07 crc kubenswrapper[4816]: E0316 00:08:07.580974 4816 kubelet_node_status.go:503] "Error getting the current node from 
lister" err="node \"crc\" not found" Mar 16 00:08:07 crc kubenswrapper[4816]: E0316 00:08:07.681386 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:08:07 crc kubenswrapper[4816]: I0316 00:08:07.690607 4816 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 16 00:08:07 crc kubenswrapper[4816]: E0316 00:08:07.771347 4816 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 16 00:08:07 crc kubenswrapper[4816]: E0316 00:08:07.781610 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:08:07 crc kubenswrapper[4816]: E0316 00:08:07.882199 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:08:07 crc kubenswrapper[4816]: E0316 00:08:07.982624 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:08:08 crc kubenswrapper[4816]: E0316 00:08:08.083773 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:08:08 crc kubenswrapper[4816]: E0316 00:08:08.184628 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:08:08 crc kubenswrapper[4816]: E0316 00:08:08.219538 4816 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.225481 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.225544 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:08 crc 
kubenswrapper[4816]: I0316 00:08:08.225599 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.225635 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.225660 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:08Z","lastTransitionTime":"2026-03-16T00:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:08 crc kubenswrapper[4816]: E0316 00:08:08.245216 4816 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d8fb348c-4907-4c8f-859a-735976530e03\\\",\\\"systemUUID\\\":\\\"e97abf98-298f-4589-a70a-4cfb5cb2994a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.250453 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.250503 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.250524 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.250583 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.250605 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:08Z","lastTransitionTime":"2026-03-16T00:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:08 crc kubenswrapper[4816]: E0316 00:08:08.268234 4816 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d8fb348c-4907-4c8f-859a-735976530e03\\\",\\\"systemUUID\\\":\\\"e97abf98-298f-4589-a70a-4cfb5cb2994a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.274167 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.274234 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.274249 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.274276 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.274293 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:08Z","lastTransitionTime":"2026-03-16T00:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:08 crc kubenswrapper[4816]: E0316 00:08:08.289363 4816 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d8fb348c-4907-4c8f-859a-735976530e03\\\",\\\"systemUUID\\\":\\\"e97abf98-298f-4589-a70a-4cfb5cb2994a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.295277 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.295354 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.295373 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.295398 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.295416 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:08Z","lastTransitionTime":"2026-03-16T00:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:08 crc kubenswrapper[4816]: E0316 00:08:08.311770 4816 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d8fb348c-4907-4c8f-859a-735976530e03\\\",\\\"systemUUID\\\":\\\"e97abf98-298f-4589-a70a-4cfb5cb2994a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 16 00:08:08 crc kubenswrapper[4816]: E0316 00:08:08.312021 4816 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 16 00:08:08 crc kubenswrapper[4816]: E0316 00:08:08.312075 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:08:08 crc kubenswrapper[4816]: E0316 00:08:08.412449 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:08:08 crc kubenswrapper[4816]: E0316 00:08:08.513340 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.592911 4816 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.607671 4816 apiserver.go:52] "Watching apiserver" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.614312 4816 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.614725 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb"] Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.615348 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.615873 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.616008 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.616240 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:08:08 crc kubenswrapper[4816]: E0316 00:08:08.616305 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:08:08 crc kubenswrapper[4816]: E0316 00:08:08.616360 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.616385 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.617114 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.617156 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.617216 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:08 crc kubenswrapper[4816]: E0316 00:08:08.617211 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.617233 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.617289 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.617310 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:08Z","lastTransitionTime":"2026-03-16T00:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.620945 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.621233 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.621307 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.621612 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.624868 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.625480 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.625475 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.626959 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.630670 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.670273 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.686223 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.697013 4816 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.704658 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.719880 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.720649 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.720687 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.720700 4816 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.720721 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.720735 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:08Z","lastTransitionTime":"2026-03-16T00:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.740736 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.747726 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.747775 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod 
\"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.747811 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.747846 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.748009 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.748047 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.748081 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.748115 4816 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.748149 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.748186 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.748220 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.748250 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.748281 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: 
\"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.748275 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.748312 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.748371 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.748411 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.748446 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod 
\"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.748481 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.748516 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.748577 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.748609 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.748645 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.748675 4816 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.748709 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.748740 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.748775 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.748782 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.748843 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.748806 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 16 00:08:08 crc kubenswrapper[4816]: E0316 00:08:08.748948 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:08:09.248902793 +0000 UTC m=+82.345202786 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.749015 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.749036 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.749072 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.749114 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.749153 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.749196 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.749226 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.749248 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.749302 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.749359 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.749396 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.749404 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.749434 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.749472 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.749506 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.749585 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.749625 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.749639 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.749666 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.749720 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.749763 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.749773 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.749804 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.749897 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.749934 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.749971 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.750004 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.750037 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" 
(UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.750075 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.750110 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.750145 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.750181 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.750214 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.750246 
4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.750277 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.750308 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.750355 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.750398 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.750436 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" 
(UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.750468 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.750502 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.750517 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.750539 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.750616 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.750597 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.750667 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.750707 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.750744 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.750778 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.750842 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.750882 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.750915 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.750948 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.750982 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.751016 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.751071 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 
16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.751105 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.751142 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.751176 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.751207 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.751239 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.751270 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod 
\"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.751303 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.751340 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.751373 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.751410 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.751445 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.751485 4816 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.751485 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.751526 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.751540 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.751685 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.751855 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.752029 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.752099 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.751588 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.752489 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.752540 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.752603 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.752612 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.752636 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.752789 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.752822 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.752849 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.752872 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.752896 4816 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.752930 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.752951 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.752970 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.752997 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.753018 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: 
\"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.753007 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.753129 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.753038 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.753224 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.753293 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 16 00:08:08 crc kubenswrapper[4816]: 
I0316 00:08:08.753351 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.753361 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.753394 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.753413 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.753470 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.753536 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.753631 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.753686 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.753741 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.753760 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.753802 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.753858 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.753916 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.753853 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.753976 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.753874 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.754067 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.754129 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.754183 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.754234 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.754239 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.754286 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.754340 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.754386 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.754428 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.754524 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.754596 4816 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.754639 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.754678 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.754715 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.754759 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.754779 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod 
"7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.754789 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.754902 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.754951 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.754989 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.755175 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.755223 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.755261 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.755302 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.755340 4816 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.755384 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.755441 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.755489 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.756080 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.756146 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod 
\"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.756210 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.756269 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.756326 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.756386 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.756444 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 16 00:08:08 crc 
kubenswrapper[4816]: I0316 00:08:08.756498 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.756535 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.756623 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.756678 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.756795 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.756863 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.756911 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.756970 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.757018 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.757075 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.757133 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.757190 
4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.757246 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.757308 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.757367 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.757426 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.757490 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod 
\"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.757592 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.757652 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.757703 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.757679 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.757747 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.757790 4816 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.757833 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.757871 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.757910 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.757949 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.757988 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod 
\"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.758027 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.758064 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.758102 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.758146 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.758205 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.758251 4816 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.758289 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.758328 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.758397 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.758461 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.758517 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod 
\"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.758590 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.758630 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.758674 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.758714 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.758754 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.758792 4816 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.758866 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.758906 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.758952 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.759010 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.759114 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod 
\"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.759183 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.759244 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.759305 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.759363 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.759427 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.759497 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.759612 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.759677 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.759855 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 16 00:08:08 
crc kubenswrapper[4816]: I0316 00:08:08.759950 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.760102 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.760187 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.760227 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.760335 4816 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.760362 4816 
reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.760387 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.760409 4816 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.760432 4816 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.760456 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.760480 4816 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.760502 4816 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.760525 4816 reconciler_common.go:293] 
"Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.760583 4816 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.760608 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.760631 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.760656 4816 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.760678 4816 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.760699 4816 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.760721 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: 
\"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.760745 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.760766 4816 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.760786 4816 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.760808 4816 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.760829 4816 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.760850 4816 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.760874 4816 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc 
kubenswrapper[4816]: I0316 00:08:08.760895 4816 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.760917 4816 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.760941 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.760962 4816 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.760983 4816 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.761005 4816 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.761028 4816 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.762350 4816 swap_util.go:74] "error creating dir 
to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.755943 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.756041 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.756244 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.756284 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.755464 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.756317 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.756346 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.756462 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.756726 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.756749 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.756779 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.757067 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.757105 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.757438 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.763213 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.763227 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.758137 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.758125 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.758405 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.758422 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.758503 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.759007 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.759049 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.759054 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.759092 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.759122 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.759192 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.759905 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.760172 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.760212 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.760239 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.760841 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.760895 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.761910 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.762065 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.762172 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.762537 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.763277 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.763518 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.763607 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.763736 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.763600 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.764046 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.764603 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.764788 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.764942 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.765052 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.765080 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.765191 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.765294 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.765752 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.765906 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.766597 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.766657 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: E0316 00:08:08.767197 4816 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.767584 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.768061 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.768371 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.768371 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.768526 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.769107 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.769401 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.769458 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.769656 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.769843 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.769905 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.769920 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.770337 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.770498 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.770866 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.771144 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.771421 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: E0316 00:08:08.773221 4816 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 16 00:08:08 crc kubenswrapper[4816]: E0316 00:08:08.773321 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-16 00:08:09.273294046 +0000 UTC m=+82.369594039 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.774514 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.770951 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: E0316 00:08:08.777358 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-16 00:08:09.277334241 +0000 UTC m=+82.373634204 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.769693 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.784338 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.784865 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.785348 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.787852 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.788147 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: E0316 00:08:08.789724 4816 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 16 00:08:08 crc kubenswrapper[4816]: E0316 00:08:08.789774 4816 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 16 00:08:08 crc kubenswrapper[4816]: E0316 00:08:08.789804 4816 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 16 00:08:08 crc kubenswrapper[4816]: E0316 00:08:08.789922 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-16 00:08:09.289884299 +0000 UTC m=+82.386184432 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 16 00:08:08 crc kubenswrapper[4816]: E0316 00:08:08.792453 4816 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 16 00:08:08 crc kubenswrapper[4816]: E0316 00:08:08.792492 4816 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 16 00:08:08 crc kubenswrapper[4816]: E0316 00:08:08.792520 4816 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 16 00:08:08 crc kubenswrapper[4816]: E0316 00:08:08.792645 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-16 00:08:09.29261707 +0000 UTC m=+82.388917203 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.792683 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.792767 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.792946 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.793139 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.793257 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.793376 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.793467 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.793506 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.793673 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.794448 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.795250 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.796399 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.796843 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.798819 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.798830 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.799419 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.799811 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.799829 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.799913 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.800304 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.800512 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.800842 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.800903 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.801434 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.802618 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.802978 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.803001 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.803164 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.803633 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.803853 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.804650 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.805901 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.805987 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.806146 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.806326 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.806960 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.807114 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.807147 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.809617 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.810027 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.810305 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.811864 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.811906 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.812092 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.812223 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.812596 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.813443 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.813727 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.814297 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.814770 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.814893 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.815017 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.815041 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.815079 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.815231 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.815720 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.816175 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.816187 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.816952 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.816995 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.816898 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.817007 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.818881 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.819215 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.819371 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.819430 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.820019 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.820148 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.820157 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.820430 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.820451 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.820660 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.820805 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.820905 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.821036 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.821050 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.821407 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.821474 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.821489 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.821523 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.821604 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.822147 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.822249 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.822359 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.822727 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.822760 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.822804 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.822824 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.822841 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: 
"49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.822893 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.822837 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:08Z","lastTransitionTime":"2026-03-16T00:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.822957 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.824325 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). 
InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.824680 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.825045 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.842632 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.845981 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.858035 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.860679 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.861937 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.861992 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.862054 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod 
\"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.862085 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.862110 4816 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.862129 4816 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.862150 4816 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.862166 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.862186 4816 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.862204 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node 
\"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.862220 4816 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.862121 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.862235 4816 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.862311 4816 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.862338 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.862358 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.862379 4816 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" 
DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.862398 4816 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.862417 4816 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.862437 4816 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.862457 4816 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.862474 4816 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.862493 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.862512 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.862531 4816 reconciler_common.go:293] "Volume detached for volume 
\"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.862573 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.862593 4816 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.862611 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.862636 4816 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.862654 4816 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.862674 4816 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.862692 4816 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" 
DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.862710 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.862729 4816 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.862748 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.862801 4816 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.862818 4816 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.862836 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.862897 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.862919 
4816 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.862937 4816 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.862956 4816 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.862977 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.862996 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.863015 4816 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.863035 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.863055 4816 reconciler_common.go:293] "Volume detached for volume 
\"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.863076 4816 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.863097 4816 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.863117 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.863136 4816 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.863155 4816 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.863174 4816 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.863194 4816 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.863212 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.863231 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.863249 4816 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.863267 4816 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.863285 4816 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.863302 4816 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.863319 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Mar 
16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.863338 4816 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.863356 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.863375 4816 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.863395 4816 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.863414 4816 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.863434 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.863454 4816 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc 
kubenswrapper[4816]: I0316 00:08:08.863471 4816 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.863490 4816 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.863507 4816 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.863525 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.863545 4816 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.863585 4816 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.863605 4816 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.863627 4816 reconciler_common.go:293] "Volume detached for 
volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.863646 4816 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.863665 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.863685 4816 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.863705 4816 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.863724 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.863743 4816 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.863762 4816 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.863784 4816 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.863805 4816 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.863824 4816 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.863842 4816 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.863860 4816 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.863877 4816 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.863894 4816 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.863911 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.863930 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.863947 4816 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.863965 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.863983 4816 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.864002 4816 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.864020 4816 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.864037 4816 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.864054 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.864071 4816 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.864089 4816 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.864106 4816 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.864124 4816 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.864143 4816 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 16 
00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.864160 4816 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.864177 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.864195 4816 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.864212 4816 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.864229 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.864246 4816 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.864264 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.864281 4816 reconciler_common.go:293] "Volume detached for volume \"client-ca\" 
(UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.864298 4816 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.864316 4816 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.864334 4816 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.864352 4816 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.864370 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.864388 4816 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.864404 4816 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" 
DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.864422 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.864440 4816 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.864458 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.864476 4816 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.864493 4816 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.864509 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.864527 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 
00:08:08.864545 4816 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.864582 4816 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.864602 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.864620 4816 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.864637 4816 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.864655 4816 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.864674 4816 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.864692 4816 reconciler_common.go:293] "Volume detached for 
volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.864713 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.864730 4816 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.864748 4816 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.864767 4816 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.864786 4816 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.864803 4816 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.864820 4816 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.864837 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.864854 4816 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.864872 4816 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.864891 4816 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.864908 4816 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.864925 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.864942 4816 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: 
\"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.864961 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.864977 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.864999 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.865017 4816 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.865034 4816 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.865051 4816 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.865068 4816 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.865086 4816 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.865103 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.865120 4816 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.865138 4816 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.865156 4816 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.865174 4816 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.865191 4816 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" 
DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.865207 4816 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.865226 4816 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.865243 4816 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.865260 4816 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.865277 4816 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.865295 4816 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.927045 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.927090 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 
00:08:08.927123 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.927146 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.927158 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:08Z","lastTransitionTime":"2026-03-16T00:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.939275 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.947359 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 16 00:08:08 crc kubenswrapper[4816]: I0316 00:08:08.953487 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 16 00:08:08 crc kubenswrapper[4816]: E0316 00:08:08.967199 4816 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 16 00:08:08 crc kubenswrapper[4816]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Mar 16 00:08:08 crc kubenswrapper[4816]: set -o allexport Mar 16 00:08:08 crc kubenswrapper[4816]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Mar 16 00:08:08 crc kubenswrapper[4816]: source /etc/kubernetes/apiserver-url.env Mar 16 00:08:08 crc kubenswrapper[4816]: else Mar 16 00:08:08 crc kubenswrapper[4816]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Mar 16 00:08:08 crc kubenswrapper[4816]: exit 1 Mar 16 00:08:08 crc kubenswrapper[4816]: fi Mar 16 00:08:08 crc kubenswrapper[4816]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Mar 16 00:08:08 crc kubenswrapper[4816]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},
EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFi
eldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 16 00:08:08 crc kubenswrapper[4816]: > logger="UnhandledError" Mar 16 00:08:08 crc kubenswrapper[4816]: E0316 00:08:08.968350 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Mar 16 00:08:08 crc kubenswrapper[4816]: W0316 00:08:08.969012 4816 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-e093c69c398039152d69290c0e1b80d97325393a60e0e47e56ac753af5c622b7 WatchSource:0}: Error finding container e093c69c398039152d69290c0e1b80d97325393a60e0e47e56ac753af5c622b7: Status 404 returned error can't find the container with id e093c69c398039152d69290c0e1b80d97325393a60e0e47e56ac753af5c622b7 Mar 16 00:08:08 crc kubenswrapper[4816]: E0316 00:08:08.975489 4816 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 16 00:08:08 crc kubenswrapper[4816]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 16 00:08:08 crc kubenswrapper[4816]: if [[ -f "/env/_master" ]]; then Mar 16 00:08:08 crc kubenswrapper[4816]: set -o allexport Mar 16 00:08:08 crc kubenswrapper[4816]: source "/env/_master" Mar 16 00:08:08 crc kubenswrapper[4816]: set +o allexport Mar 16 00:08:08 crc kubenswrapper[4816]: fi Mar 16 00:08:08 crc kubenswrapper[4816]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. 
Mar 16 00:08:08 crc kubenswrapper[4816]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Mar 16 00:08:08 crc kubenswrapper[4816]: ho_enable="--enable-hybrid-overlay" Mar 16 00:08:08 crc kubenswrapper[4816]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Mar 16 00:08:08 crc kubenswrapper[4816]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Mar 16 00:08:08 crc kubenswrapper[4816]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Mar 16 00:08:08 crc kubenswrapper[4816]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 16 00:08:08 crc kubenswrapper[4816]: --webhook-cert-dir="/etc/webhook-cert" \ Mar 16 00:08:08 crc kubenswrapper[4816]: --webhook-host=127.0.0.1 \ Mar 16 00:08:08 crc kubenswrapper[4816]: --webhook-port=9743 \ Mar 16 00:08:08 crc kubenswrapper[4816]: ${ho_enable} \ Mar 16 00:08:08 crc kubenswrapper[4816]: --enable-interconnect \ Mar 16 00:08:08 crc kubenswrapper[4816]: --disable-approver \ Mar 16 00:08:08 crc kubenswrapper[4816]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Mar 16 00:08:08 crc kubenswrapper[4816]: --wait-for-kubernetes-api=200s \ Mar 16 00:08:08 crc kubenswrapper[4816]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Mar 16 00:08:08 crc kubenswrapper[4816]: --loglevel="${LOGLEVEL}" Mar 16 00:08:08 crc kubenswrapper[4816]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: 
{{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 16 00:08:08 crc kubenswrapper[4816]: > logger="UnhandledError" Mar 16 00:08:08 crc kubenswrapper[4816]: E0316 00:08:08.978181 4816 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 16 00:08:08 crc kubenswrapper[4816]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 16 00:08:08 crc 
kubenswrapper[4816]: if [[ -f "/env/_master" ]]; then Mar 16 00:08:08 crc kubenswrapper[4816]: set -o allexport Mar 16 00:08:08 crc kubenswrapper[4816]: source "/env/_master" Mar 16 00:08:08 crc kubenswrapper[4816]: set +o allexport Mar 16 00:08:08 crc kubenswrapper[4816]: fi Mar 16 00:08:08 crc kubenswrapper[4816]: Mar 16 00:08:08 crc kubenswrapper[4816]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Mar 16 00:08:08 crc kubenswrapper[4816]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 16 00:08:08 crc kubenswrapper[4816]: --disable-webhook \ Mar 16 00:08:08 crc kubenswrapper[4816]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Mar 16 00:08:08 crc kubenswrapper[4816]: --loglevel="${LOGLEVEL}" Mar 16 00:08:08 crc kubenswrapper[4816]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 16 00:08:08 crc kubenswrapper[4816]: > logger="UnhandledError" Mar 16 00:08:08 crc kubenswrapper[4816]: W0316 00:08:08.978328 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-af5042ac8c384074c42d42cc06323d63f664527343509a45a3573a02ad94e5b4 WatchSource:0}: Error finding container af5042ac8c384074c42d42cc06323d63f664527343509a45a3573a02ad94e5b4: Status 404 returned error can't find the container with id 
af5042ac8c384074c42d42cc06323d63f664527343509a45a3573a02ad94e5b4 Mar 16 00:08:08 crc kubenswrapper[4816]: E0316 00:08:08.979452 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Mar 16 00:08:08 crc kubenswrapper[4816]: E0316 00:08:08.988632 4816 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 16 00:08:08 crc kubenswrapper[4816]: E0316 00:08:08.990024 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.028277 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"7a475f9be4daee3b9f0f7fd48f4450e70f874432d3f4fee48e7d286c66e3de56"} Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.030190 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"af5042ac8c384074c42d42cc06323d63f664527343509a45a3573a02ad94e5b4"} Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.030365 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.030405 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.030433 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.030461 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.030480 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:09Z","lastTransitionTime":"2026-03-16T00:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.031625 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"e093c69c398039152d69290c0e1b80d97325393a60e0e47e56ac753af5c622b7"} Mar 16 00:08:09 crc kubenswrapper[4816]: E0316 00:08:09.036020 4816 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 16 00:08:09 crc kubenswrapper[4816]: E0316 00:08:09.036698 4816 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 16 00:08:09 crc kubenswrapper[4816]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Mar 16 00:08:09 crc kubenswrapper[4816]: set -o allexport Mar 16 00:08:09 crc kubenswrapper[4816]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Mar 16 00:08:09 crc kubenswrapper[4816]: source /etc/kubernetes/apiserver-url.env Mar 16 00:08:09 crc 
kubenswrapper[4816]: else Mar 16 00:08:09 crc kubenswrapper[4816]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Mar 16 00:08:09 crc kubenswrapper[4816]: exit 1 Mar 16 00:08:09 crc kubenswrapper[4816]: fi Mar 16 00:08:09 crc kubenswrapper[4816]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Mar 16 00:08:09 crc kubenswrapper[4816]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,V
alue:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f
5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 16 00:08:09 crc kubenswrapper[4816]: > logger="UnhandledError" Mar 16 00:08:09 crc kubenswrapper[4816]: E0316 00:08:09.037196 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, 
cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Mar 16 00:08:09 crc kubenswrapper[4816]: E0316 00:08:09.037852 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Mar 16 00:08:09 crc kubenswrapper[4816]: E0316 00:08:09.040903 4816 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 16 00:08:09 crc kubenswrapper[4816]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 16 00:08:09 crc kubenswrapper[4816]: if [[ -f "/env/_master" ]]; then Mar 16 00:08:09 crc kubenswrapper[4816]: set -o allexport Mar 16 00:08:09 crc kubenswrapper[4816]: source "/env/_master" Mar 16 00:08:09 crc kubenswrapper[4816]: set +o allexport Mar 16 00:08:09 crc kubenswrapper[4816]: fi Mar 16 00:08:09 crc kubenswrapper[4816]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. 
Mar 16 00:08:09 crc kubenswrapper[4816]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Mar 16 00:08:09 crc kubenswrapper[4816]: ho_enable="--enable-hybrid-overlay" Mar 16 00:08:09 crc kubenswrapper[4816]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Mar 16 00:08:09 crc kubenswrapper[4816]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Mar 16 00:08:09 crc kubenswrapper[4816]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Mar 16 00:08:09 crc kubenswrapper[4816]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 16 00:08:09 crc kubenswrapper[4816]: --webhook-cert-dir="/etc/webhook-cert" \ Mar 16 00:08:09 crc kubenswrapper[4816]: --webhook-host=127.0.0.1 \ Mar 16 00:08:09 crc kubenswrapper[4816]: --webhook-port=9743 \ Mar 16 00:08:09 crc kubenswrapper[4816]: ${ho_enable} \ Mar 16 00:08:09 crc kubenswrapper[4816]: --enable-interconnect \ Mar 16 00:08:09 crc kubenswrapper[4816]: --disable-approver \ Mar 16 00:08:09 crc kubenswrapper[4816]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Mar 16 00:08:09 crc kubenswrapper[4816]: --wait-for-kubernetes-api=200s \ Mar 16 00:08:09 crc kubenswrapper[4816]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Mar 16 00:08:09 crc kubenswrapper[4816]: --loglevel="${LOGLEVEL}" Mar 16 00:08:09 crc kubenswrapper[4816]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: 
{{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 16 00:08:09 crc kubenswrapper[4816]: > logger="UnhandledError" Mar 16 00:08:09 crc kubenswrapper[4816]: E0316 00:08:09.043714 4816 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 16 00:08:09 crc kubenswrapper[4816]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 16 00:08:09 crc 
kubenswrapper[4816]: if [[ -f "/env/_master" ]]; then Mar 16 00:08:09 crc kubenswrapper[4816]: set -o allexport Mar 16 00:08:09 crc kubenswrapper[4816]: source "/env/_master" Mar 16 00:08:09 crc kubenswrapper[4816]: set +o allexport Mar 16 00:08:09 crc kubenswrapper[4816]: fi Mar 16 00:08:09 crc kubenswrapper[4816]: Mar 16 00:08:09 crc kubenswrapper[4816]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Mar 16 00:08:09 crc kubenswrapper[4816]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 16 00:08:09 crc kubenswrapper[4816]: --disable-webhook \ Mar 16 00:08:09 crc kubenswrapper[4816]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Mar 16 00:08:09 crc kubenswrapper[4816]: --loglevel="${LOGLEVEL}" Mar 16 00:08:09 crc kubenswrapper[4816]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 16 00:08:09 crc kubenswrapper[4816]: > logger="UnhandledError" Mar 16 00:08:09 crc kubenswrapper[4816]: E0316 00:08:09.044950 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.045615 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.054821 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.067077 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.078521 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.087944 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.098767 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.111413 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.122580 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.133173 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.133974 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.134027 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.134046 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.134071 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.134094 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:09Z","lastTransitionTime":"2026-03-16T00:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.145919 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.160188 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.171203 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.237523 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.237595 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.237608 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.237631 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.237661 4816 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:09Z","lastTransitionTime":"2026-03-16T00:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.269210 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:08:09 crc kubenswrapper[4816]: E0316 00:08:09.269389 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:08:10.269365579 +0000 UTC m=+83.365665532 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.339821 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.339864 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.339874 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.339892 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.339903 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:09Z","lastTransitionTime":"2026-03-16T00:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.370595 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.370659 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.370695 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.370718 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 16 00:08:09 crc kubenswrapper[4816]: E0316 00:08:09.370793 4816 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Mar 16 00:08:09 crc kubenswrapper[4816]: E0316 00:08:09.370861 4816 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 16 00:08:09 crc kubenswrapper[4816]: E0316 00:08:09.370876 4816 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 16 00:08:09 crc kubenswrapper[4816]: E0316 00:08:09.370882 4816 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 16 00:08:09 crc kubenswrapper[4816]: E0316 00:08:09.370889 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-16 00:08:10.370867752 +0000 UTC m=+83.467167775 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Mar 16 00:08:09 crc kubenswrapper[4816]: E0316 00:08:09.370895 4816 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 16 00:08:09 crc kubenswrapper[4816]: E0316 00:08:09.370891 4816 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 16 00:08:09 crc kubenswrapper[4816]: E0316 00:08:09.370983 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-16 00:08:10.370963285 +0000 UTC m=+83.467263268 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 16 00:08:09 crc kubenswrapper[4816]: E0316 00:08:09.370902 4816 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 16 00:08:09 crc kubenswrapper[4816]: E0316 00:08:09.371068 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-16 00:08:10.371053368 +0000 UTC m=+83.467353361 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 16 00:08:09 crc kubenswrapper[4816]: E0316 00:08:09.370909 4816 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 16 00:08:09 crc kubenswrapper[4816]: E0316 00:08:09.371121 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-16 00:08:10.37110923 +0000 UTC m=+83.467409223 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.443328 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.443370 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.443382 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.443399 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.443415 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:09Z","lastTransitionTime":"2026-03-16T00:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.546235 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.546296 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.546317 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.546339 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.546353 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:09Z","lastTransitionTime":"2026-03-16T00:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.649537 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.650004 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.650024 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.650053 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.650075 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:09Z","lastTransitionTime":"2026-03-16T00:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.671350 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes"
Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.671874 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes"
Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.673124 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes"
Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.673768 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes"
Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.674686 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes"
Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.675141 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes"
Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.675682 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes"
Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.676526 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes"
Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.677138 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes"
Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.678013 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes"
Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.678494 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes"
Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.679512 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes"
Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.679989 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes"
Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.680459 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes"
Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.681278 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes"
Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.681789 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes"
Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.682749 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes"
Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.683101 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes"
Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.683742 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes"
Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.684645 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes"
Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.685102 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes"
Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.686114 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes"
Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.686535 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes"
Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.687537 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes"
Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.688133 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes"
Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.688778 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes"
Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.689840 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes"
Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.690279 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes"
Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.691153 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes"
Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.691596 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes"
Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.692368 4816 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6"
Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.692465 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes"
Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.693959 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes"
Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.694823 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes"
Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.695297 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes"
Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.696785 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes"
Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.697373 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes"
Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.698192 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes"
Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.698863 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes"
Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.699893 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes"
Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.700322 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes"
Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.701226 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes"
Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.701801 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes"
Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.702726 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes"
Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.703176 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes"
Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.704127 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes"
Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.704625 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes"
Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.705657 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes"
Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.706112 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes"
Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.706906 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes"
Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.707356 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes"
Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.708198 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes"
Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.708749 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes"
Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.709188 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes"
Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.752198 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.752245 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.752256 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.752274 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.752285 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:09Z","lastTransitionTime":"2026-03-16T00:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.854425 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.854491 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.854512 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.854542 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.854597 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:09Z","lastTransitionTime":"2026-03-16T00:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.957231 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.957297 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.957319 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.957352 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 16 00:08:09 crc kubenswrapper[4816]: I0316 00:08:09.957373 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:09Z","lastTransitionTime":"2026-03-16T00:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 16 00:08:10 crc kubenswrapper[4816]: I0316 00:08:10.060055 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:08:10 crc kubenswrapper[4816]: I0316 00:08:10.060141 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:08:10 crc kubenswrapper[4816]: I0316 00:08:10.060169 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:08:10 crc kubenswrapper[4816]: I0316 00:08:10.060218 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 16 00:08:10 crc kubenswrapper[4816]: I0316 00:08:10.060242 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:10Z","lastTransitionTime":"2026-03-16T00:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 16 00:08:10 crc kubenswrapper[4816]: I0316 00:08:10.163048 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:08:10 crc kubenswrapper[4816]: I0316 00:08:10.163124 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:08:10 crc kubenswrapper[4816]: I0316 00:08:10.163151 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:08:10 crc kubenswrapper[4816]: I0316 00:08:10.163180 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 16 00:08:10 crc kubenswrapper[4816]: I0316 00:08:10.163203 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:10Z","lastTransitionTime":"2026-03-16T00:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 16 00:08:10 crc kubenswrapper[4816]: I0316 00:08:10.266367 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:08:10 crc kubenswrapper[4816]: I0316 00:08:10.266432 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:08:10 crc kubenswrapper[4816]: I0316 00:08:10.266451 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:08:10 crc kubenswrapper[4816]: I0316 00:08:10.266477 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 16 00:08:10 crc kubenswrapper[4816]: I0316 00:08:10.266495 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:10Z","lastTransitionTime":"2026-03-16T00:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 16 00:08:10 crc kubenswrapper[4816]: I0316 00:08:10.279774 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 16 00:08:10 crc kubenswrapper[4816]: E0316 00:08:10.279937 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:08:12.279914117 +0000 UTC m=+85.376214070 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:08:10 crc kubenswrapper[4816]: I0316 00:08:10.368897 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:08:10 crc kubenswrapper[4816]: I0316 00:08:10.368962 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:08:10 crc kubenswrapper[4816]: I0316 00:08:10.368979 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:08:10 crc kubenswrapper[4816]: I0316 00:08:10.369007 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 16 00:08:10 crc kubenswrapper[4816]: I0316 00:08:10.369026 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:10Z","lastTransitionTime":"2026-03-16T00:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 16 00:08:10 crc kubenswrapper[4816]: I0316 00:08:10.380739 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 16 00:08:10 crc kubenswrapper[4816]: I0316 00:08:10.380808 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 16 00:08:10 crc kubenswrapper[4816]: I0316 00:08:10.380859 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 16 00:08:10 crc kubenswrapper[4816]: E0316 00:08:10.380875 4816 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Mar 16 00:08:10 crc kubenswrapper[4816]: I0316 00:08:10.380949 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 16 00:08:10 crc kubenswrapper[4816]: E0316 00:08:10.380976 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-16 00:08:12.380950314 +0000 UTC m=+85.477250297 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Mar 16 00:08:10 crc kubenswrapper[4816]: E0316 00:08:10.381062 4816 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 16 00:08:10 crc kubenswrapper[4816]: E0316 00:08:10.381152 4816 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 16 00:08:10 crc kubenswrapper[4816]: E0316 00:08:10.381197 4816 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 16 00:08:10 crc kubenswrapper[4816]: E0316 00:08:10.381224 4816 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 16 00:08:10 crc kubenswrapper[4816]: E0316 00:08:10.381200 4816 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 16 00:08:10 crc kubenswrapper[4816]: E0316 00:08:10.381167 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-16 00:08:12.381146421 +0000 UTC m=+85.477446414 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 16 00:08:10 crc kubenswrapper[4816]: E0316 00:08:10.381314 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-16 00:08:12.381299006 +0000 UTC m=+85.477598999 (durationBeforeRetry 2s).
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 16 00:08:10 crc kubenswrapper[4816]: E0316 00:08:10.381278 4816 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 16 00:08:10 crc kubenswrapper[4816]: E0316 00:08:10.381349 4816 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 16 00:08:10 crc kubenswrapper[4816]: E0316 00:08:10.381418 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-16 00:08:12.381394409 +0000 UTC m=+85.477694402 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 16 00:08:10 crc kubenswrapper[4816]: I0316 00:08:10.471437 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:10 crc kubenswrapper[4816]: I0316 00:08:10.471528 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:10 crc kubenswrapper[4816]: I0316 00:08:10.471569 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:10 crc kubenswrapper[4816]: I0316 00:08:10.471595 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:10 crc kubenswrapper[4816]: I0316 00:08:10.471615 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:10Z","lastTransitionTime":"2026-03-16T00:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:10 crc kubenswrapper[4816]: I0316 00:08:10.573905 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:10 crc kubenswrapper[4816]: I0316 00:08:10.573982 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:10 crc kubenswrapper[4816]: I0316 00:08:10.573991 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:10 crc kubenswrapper[4816]: I0316 00:08:10.574005 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:10 crc kubenswrapper[4816]: I0316 00:08:10.574028 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:10Z","lastTransitionTime":"2026-03-16T00:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:10 crc kubenswrapper[4816]: I0316 00:08:10.666995 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:08:10 crc kubenswrapper[4816]: I0316 00:08:10.667057 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:08:10 crc kubenswrapper[4816]: I0316 00:08:10.667013 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:08:10 crc kubenswrapper[4816]: E0316 00:08:10.667314 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:08:10 crc kubenswrapper[4816]: E0316 00:08:10.667719 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:08:10 crc kubenswrapper[4816]: E0316 00:08:10.667600 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:08:10 crc kubenswrapper[4816]: I0316 00:08:10.676587 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:10 crc kubenswrapper[4816]: I0316 00:08:10.676628 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:10 crc kubenswrapper[4816]: I0316 00:08:10.676638 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:10 crc kubenswrapper[4816]: I0316 00:08:10.676653 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:10 crc kubenswrapper[4816]: I0316 00:08:10.676663 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:10Z","lastTransitionTime":"2026-03-16T00:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:10 crc kubenswrapper[4816]: I0316 00:08:10.780093 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:10 crc kubenswrapper[4816]: I0316 00:08:10.780162 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:10 crc kubenswrapper[4816]: I0316 00:08:10.780180 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:10 crc kubenswrapper[4816]: I0316 00:08:10.780207 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:10 crc kubenswrapper[4816]: I0316 00:08:10.780225 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:10Z","lastTransitionTime":"2026-03-16T00:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:10 crc kubenswrapper[4816]: I0316 00:08:10.882181 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:10 crc kubenswrapper[4816]: I0316 00:08:10.882218 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:10 crc kubenswrapper[4816]: I0316 00:08:10.882228 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:10 crc kubenswrapper[4816]: I0316 00:08:10.882247 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:10 crc kubenswrapper[4816]: I0316 00:08:10.882259 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:10Z","lastTransitionTime":"2026-03-16T00:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:10 crc kubenswrapper[4816]: I0316 00:08:10.992918 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:10 crc kubenswrapper[4816]: I0316 00:08:10.992990 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:10 crc kubenswrapper[4816]: I0316 00:08:10.993008 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:10 crc kubenswrapper[4816]: I0316 00:08:10.993032 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:10 crc kubenswrapper[4816]: I0316 00:08:10.993057 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:10Z","lastTransitionTime":"2026-03-16T00:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:11 crc kubenswrapper[4816]: I0316 00:08:11.096147 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:11 crc kubenswrapper[4816]: I0316 00:08:11.096227 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:11 crc kubenswrapper[4816]: I0316 00:08:11.096248 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:11 crc kubenswrapper[4816]: I0316 00:08:11.096272 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:11 crc kubenswrapper[4816]: I0316 00:08:11.096290 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:11Z","lastTransitionTime":"2026-03-16T00:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:11 crc kubenswrapper[4816]: I0316 00:08:11.199590 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:11 crc kubenswrapper[4816]: I0316 00:08:11.199653 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:11 crc kubenswrapper[4816]: I0316 00:08:11.199675 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:11 crc kubenswrapper[4816]: I0316 00:08:11.199705 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:11 crc kubenswrapper[4816]: I0316 00:08:11.199727 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:11Z","lastTransitionTime":"2026-03-16T00:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:11 crc kubenswrapper[4816]: I0316 00:08:11.302950 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:11 crc kubenswrapper[4816]: I0316 00:08:11.303016 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:11 crc kubenswrapper[4816]: I0316 00:08:11.303039 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:11 crc kubenswrapper[4816]: I0316 00:08:11.303072 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:11 crc kubenswrapper[4816]: I0316 00:08:11.303092 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:11Z","lastTransitionTime":"2026-03-16T00:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:11 crc kubenswrapper[4816]: I0316 00:08:11.405991 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:11 crc kubenswrapper[4816]: I0316 00:08:11.406044 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:11 crc kubenswrapper[4816]: I0316 00:08:11.406062 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:11 crc kubenswrapper[4816]: I0316 00:08:11.406085 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:11 crc kubenswrapper[4816]: I0316 00:08:11.406104 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:11Z","lastTransitionTime":"2026-03-16T00:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:11 crc kubenswrapper[4816]: I0316 00:08:11.509047 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:11 crc kubenswrapper[4816]: I0316 00:08:11.509112 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:11 crc kubenswrapper[4816]: I0316 00:08:11.509134 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:11 crc kubenswrapper[4816]: I0316 00:08:11.509163 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:11 crc kubenswrapper[4816]: I0316 00:08:11.509185 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:11Z","lastTransitionTime":"2026-03-16T00:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:11 crc kubenswrapper[4816]: I0316 00:08:11.611845 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:11 crc kubenswrapper[4816]: I0316 00:08:11.611894 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:11 crc kubenswrapper[4816]: I0316 00:08:11.611910 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:11 crc kubenswrapper[4816]: I0316 00:08:11.611935 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:11 crc kubenswrapper[4816]: I0316 00:08:11.611951 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:11Z","lastTransitionTime":"2026-03-16T00:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:11 crc kubenswrapper[4816]: I0316 00:08:11.714581 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:11 crc kubenswrapper[4816]: I0316 00:08:11.714669 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:11 crc kubenswrapper[4816]: I0316 00:08:11.714687 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:11 crc kubenswrapper[4816]: I0316 00:08:11.714713 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:11 crc kubenswrapper[4816]: I0316 00:08:11.714729 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:11Z","lastTransitionTime":"2026-03-16T00:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:11 crc kubenswrapper[4816]: I0316 00:08:11.817636 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:11 crc kubenswrapper[4816]: I0316 00:08:11.818071 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:11 crc kubenswrapper[4816]: I0316 00:08:11.818258 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:11 crc kubenswrapper[4816]: I0316 00:08:11.818459 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:11 crc kubenswrapper[4816]: I0316 00:08:11.818669 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:11Z","lastTransitionTime":"2026-03-16T00:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:11 crc kubenswrapper[4816]: I0316 00:08:11.921966 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:11 crc kubenswrapper[4816]: I0316 00:08:11.922040 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:11 crc kubenswrapper[4816]: I0316 00:08:11.922060 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:11 crc kubenswrapper[4816]: I0316 00:08:11.922117 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:11 crc kubenswrapper[4816]: I0316 00:08:11.922140 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:11Z","lastTransitionTime":"2026-03-16T00:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:12 crc kubenswrapper[4816]: I0316 00:08:12.025212 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:12 crc kubenswrapper[4816]: I0316 00:08:12.025267 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:12 crc kubenswrapper[4816]: I0316 00:08:12.025309 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:12 crc kubenswrapper[4816]: I0316 00:08:12.025335 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:12 crc kubenswrapper[4816]: I0316 00:08:12.025352 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:12Z","lastTransitionTime":"2026-03-16T00:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:12 crc kubenswrapper[4816]: I0316 00:08:12.128017 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:12 crc kubenswrapper[4816]: I0316 00:08:12.128086 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:12 crc kubenswrapper[4816]: I0316 00:08:12.128107 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:12 crc kubenswrapper[4816]: I0316 00:08:12.128132 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:12 crc kubenswrapper[4816]: I0316 00:08:12.128152 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:12Z","lastTransitionTime":"2026-03-16T00:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:12 crc kubenswrapper[4816]: I0316 00:08:12.230146 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:12 crc kubenswrapper[4816]: I0316 00:08:12.230199 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:12 crc kubenswrapper[4816]: I0316 00:08:12.230215 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:12 crc kubenswrapper[4816]: I0316 00:08:12.230243 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:12 crc kubenswrapper[4816]: I0316 00:08:12.230260 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:12Z","lastTransitionTime":"2026-03-16T00:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:12 crc kubenswrapper[4816]: I0316 00:08:12.298971 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:08:12 crc kubenswrapper[4816]: E0316 00:08:12.299333 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-16 00:08:16.299131131 +0000 UTC m=+89.395431084 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:08:12 crc kubenswrapper[4816]: I0316 00:08:12.332817 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:12 crc kubenswrapper[4816]: I0316 00:08:12.332848 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:12 crc kubenswrapper[4816]: I0316 00:08:12.332860 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:12 crc kubenswrapper[4816]: I0316 00:08:12.332877 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:12 crc kubenswrapper[4816]: I0316 00:08:12.332889 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:12Z","lastTransitionTime":"2026-03-16T00:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:12 crc kubenswrapper[4816]: I0316 00:08:12.400261 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:08:12 crc kubenswrapper[4816]: I0316 00:08:12.400349 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:08:12 crc kubenswrapper[4816]: I0316 00:08:12.400391 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:08:12 crc kubenswrapper[4816]: I0316 00:08:12.400427 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:08:12 crc kubenswrapper[4816]: E0316 00:08:12.400479 4816 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object 
"openshift-network-console"/"networking-console-plugin-cert" not registered Mar 16 00:08:12 crc kubenswrapper[4816]: E0316 00:08:12.400533 4816 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 16 00:08:12 crc kubenswrapper[4816]: E0316 00:08:12.400584 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-16 00:08:16.400562901 +0000 UTC m=+89.496862854 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 16 00:08:12 crc kubenswrapper[4816]: E0316 00:08:12.400608 4816 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 16 00:08:12 crc kubenswrapper[4816]: E0316 00:08:12.400640 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-16 00:08:16.400618373 +0000 UTC m=+89.496918416 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 16 00:08:12 crc kubenswrapper[4816]: E0316 00:08:12.400651 4816 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 16 00:08:12 crc kubenswrapper[4816]: E0316 00:08:12.400658 4816 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 16 00:08:12 crc kubenswrapper[4816]: E0316 00:08:12.400696 4816 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 16 00:08:12 crc kubenswrapper[4816]: E0316 00:08:12.400719 4816 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 16 00:08:12 crc kubenswrapper[4816]: E0316 00:08:12.400671 4816 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 16 00:08:12 crc kubenswrapper[4816]: E0316 00:08:12.400774 4816 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-16 00:08:16.400756348 +0000 UTC m=+89.497056351 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 16 00:08:12 crc kubenswrapper[4816]: E0316 00:08:12.400889 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-16 00:08:16.400846651 +0000 UTC m=+89.497146634 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 16 00:08:12 crc kubenswrapper[4816]: I0316 00:08:12.436075 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:12 crc kubenswrapper[4816]: I0316 00:08:12.436139 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:12 crc kubenswrapper[4816]: I0316 00:08:12.436157 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:12 crc kubenswrapper[4816]: I0316 00:08:12.436181 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:12 crc kubenswrapper[4816]: I0316 00:08:12.436199 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:12Z","lastTransitionTime":"2026-03-16T00:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:12 crc kubenswrapper[4816]: I0316 00:08:12.538965 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:12 crc kubenswrapper[4816]: I0316 00:08:12.539027 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:12 crc kubenswrapper[4816]: I0316 00:08:12.539045 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:12 crc kubenswrapper[4816]: I0316 00:08:12.539072 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:12 crc kubenswrapper[4816]: I0316 00:08:12.539090 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:12Z","lastTransitionTime":"2026-03-16T00:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:12 crc kubenswrapper[4816]: I0316 00:08:12.642575 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:12 crc kubenswrapper[4816]: I0316 00:08:12.642616 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:12 crc kubenswrapper[4816]: I0316 00:08:12.642627 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:12 crc kubenswrapper[4816]: I0316 00:08:12.642645 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:12 crc kubenswrapper[4816]: I0316 00:08:12.642658 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:12Z","lastTransitionTime":"2026-03-16T00:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:12 crc kubenswrapper[4816]: I0316 00:08:12.667303 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:08:12 crc kubenswrapper[4816]: I0316 00:08:12.667338 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:08:12 crc kubenswrapper[4816]: E0316 00:08:12.667455 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:08:12 crc kubenswrapper[4816]: I0316 00:08:12.667312 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:08:12 crc kubenswrapper[4816]: E0316 00:08:12.667661 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:08:12 crc kubenswrapper[4816]: E0316 00:08:12.667750 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:08:12 crc kubenswrapper[4816]: I0316 00:08:12.745516 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:12 crc kubenswrapper[4816]: I0316 00:08:12.745578 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:12 crc kubenswrapper[4816]: I0316 00:08:12.745589 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:12 crc kubenswrapper[4816]: I0316 00:08:12.745607 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:12 crc kubenswrapper[4816]: I0316 00:08:12.745618 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:12Z","lastTransitionTime":"2026-03-16T00:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:12 crc kubenswrapper[4816]: I0316 00:08:12.848772 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:12 crc kubenswrapper[4816]: I0316 00:08:12.848817 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:12 crc kubenswrapper[4816]: I0316 00:08:12.848827 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:12 crc kubenswrapper[4816]: I0316 00:08:12.848845 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:12 crc kubenswrapper[4816]: I0316 00:08:12.848857 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:12Z","lastTransitionTime":"2026-03-16T00:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:12 crc kubenswrapper[4816]: I0316 00:08:12.951492 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:12 crc kubenswrapper[4816]: I0316 00:08:12.951542 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:12 crc kubenswrapper[4816]: I0316 00:08:12.951565 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:12 crc kubenswrapper[4816]: I0316 00:08:12.951582 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:12 crc kubenswrapper[4816]: I0316 00:08:12.951596 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:12Z","lastTransitionTime":"2026-03-16T00:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:13 crc kubenswrapper[4816]: I0316 00:08:13.054083 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:13 crc kubenswrapper[4816]: I0316 00:08:13.054130 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:13 crc kubenswrapper[4816]: I0316 00:08:13.054141 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:13 crc kubenswrapper[4816]: I0316 00:08:13.054158 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:13 crc kubenswrapper[4816]: I0316 00:08:13.054171 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:13Z","lastTransitionTime":"2026-03-16T00:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:13 crc kubenswrapper[4816]: I0316 00:08:13.156845 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:13 crc kubenswrapper[4816]: I0316 00:08:13.157132 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:13 crc kubenswrapper[4816]: I0316 00:08:13.157152 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:13 crc kubenswrapper[4816]: I0316 00:08:13.157175 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:13 crc kubenswrapper[4816]: I0316 00:08:13.157190 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:13Z","lastTransitionTime":"2026-03-16T00:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:13 crc kubenswrapper[4816]: I0316 00:08:13.260602 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:13 crc kubenswrapper[4816]: I0316 00:08:13.260680 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:13 crc kubenswrapper[4816]: I0316 00:08:13.260734 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:13 crc kubenswrapper[4816]: I0316 00:08:13.260768 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:13 crc kubenswrapper[4816]: I0316 00:08:13.260789 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:13Z","lastTransitionTime":"2026-03-16T00:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:13 crc kubenswrapper[4816]: I0316 00:08:13.363744 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:13 crc kubenswrapper[4816]: I0316 00:08:13.363811 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:13 crc kubenswrapper[4816]: I0316 00:08:13.363834 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:13 crc kubenswrapper[4816]: I0316 00:08:13.363869 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:13 crc kubenswrapper[4816]: I0316 00:08:13.363907 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:13Z","lastTransitionTime":"2026-03-16T00:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:13 crc kubenswrapper[4816]: I0316 00:08:13.466741 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:13 crc kubenswrapper[4816]: I0316 00:08:13.466821 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:13 crc kubenswrapper[4816]: I0316 00:08:13.466843 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:13 crc kubenswrapper[4816]: I0316 00:08:13.466874 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:13 crc kubenswrapper[4816]: I0316 00:08:13.466894 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:13Z","lastTransitionTime":"2026-03-16T00:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:13 crc kubenswrapper[4816]: I0316 00:08:13.569942 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:13 crc kubenswrapper[4816]: I0316 00:08:13.570008 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:13 crc kubenswrapper[4816]: I0316 00:08:13.570027 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:13 crc kubenswrapper[4816]: I0316 00:08:13.570054 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:13 crc kubenswrapper[4816]: I0316 00:08:13.570071 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:13Z","lastTransitionTime":"2026-03-16T00:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:13 crc kubenswrapper[4816]: I0316 00:08:13.673348 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:13 crc kubenswrapper[4816]: I0316 00:08:13.673414 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:13 crc kubenswrapper[4816]: I0316 00:08:13.673440 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:13 crc kubenswrapper[4816]: I0316 00:08:13.673471 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:13 crc kubenswrapper[4816]: I0316 00:08:13.673498 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:13Z","lastTransitionTime":"2026-03-16T00:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:13 crc kubenswrapper[4816]: I0316 00:08:13.776694 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:13 crc kubenswrapper[4816]: I0316 00:08:13.776750 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:13 crc kubenswrapper[4816]: I0316 00:08:13.776761 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:13 crc kubenswrapper[4816]: I0316 00:08:13.776781 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:13 crc kubenswrapper[4816]: I0316 00:08:13.776792 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:13Z","lastTransitionTime":"2026-03-16T00:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:13 crc kubenswrapper[4816]: I0316 00:08:13.879838 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:13 crc kubenswrapper[4816]: I0316 00:08:13.879917 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:13 crc kubenswrapper[4816]: I0316 00:08:13.879931 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:13 crc kubenswrapper[4816]: I0316 00:08:13.879955 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:13 crc kubenswrapper[4816]: I0316 00:08:13.879969 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:13Z","lastTransitionTime":"2026-03-16T00:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:13 crc kubenswrapper[4816]: I0316 00:08:13.982053 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:13 crc kubenswrapper[4816]: I0316 00:08:13.982097 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:13 crc kubenswrapper[4816]: I0316 00:08:13.982106 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:13 crc kubenswrapper[4816]: I0316 00:08:13.982125 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:13 crc kubenswrapper[4816]: I0316 00:08:13.982136 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:13Z","lastTransitionTime":"2026-03-16T00:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:14 crc kubenswrapper[4816]: I0316 00:08:14.085034 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:14 crc kubenswrapper[4816]: I0316 00:08:14.085592 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:14 crc kubenswrapper[4816]: I0316 00:08:14.085677 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:14 crc kubenswrapper[4816]: I0316 00:08:14.085746 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:14 crc kubenswrapper[4816]: I0316 00:08:14.085807 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:14Z","lastTransitionTime":"2026-03-16T00:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:14 crc kubenswrapper[4816]: I0316 00:08:14.189749 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:14 crc kubenswrapper[4816]: I0316 00:08:14.190245 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:14 crc kubenswrapper[4816]: I0316 00:08:14.190326 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:14 crc kubenswrapper[4816]: I0316 00:08:14.190405 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:14 crc kubenswrapper[4816]: I0316 00:08:14.190484 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:14Z","lastTransitionTime":"2026-03-16T00:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:14 crc kubenswrapper[4816]: I0316 00:08:14.293650 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:14 crc kubenswrapper[4816]: I0316 00:08:14.293718 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:14 crc kubenswrapper[4816]: I0316 00:08:14.293746 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:14 crc kubenswrapper[4816]: I0316 00:08:14.293776 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:14 crc kubenswrapper[4816]: I0316 00:08:14.293800 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:14Z","lastTransitionTime":"2026-03-16T00:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:14 crc kubenswrapper[4816]: I0316 00:08:14.396594 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:14 crc kubenswrapper[4816]: I0316 00:08:14.396648 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:14 crc kubenswrapper[4816]: I0316 00:08:14.396695 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:14 crc kubenswrapper[4816]: I0316 00:08:14.396720 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:14 crc kubenswrapper[4816]: I0316 00:08:14.396738 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:14Z","lastTransitionTime":"2026-03-16T00:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:14 crc kubenswrapper[4816]: I0316 00:08:14.499531 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:14 crc kubenswrapper[4816]: I0316 00:08:14.499628 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:14 crc kubenswrapper[4816]: I0316 00:08:14.499653 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:14 crc kubenswrapper[4816]: I0316 00:08:14.499683 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:14 crc kubenswrapper[4816]: I0316 00:08:14.499704 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:14Z","lastTransitionTime":"2026-03-16T00:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:14 crc kubenswrapper[4816]: I0316 00:08:14.603091 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:14 crc kubenswrapper[4816]: I0316 00:08:14.603145 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:14 crc kubenswrapper[4816]: I0316 00:08:14.603161 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:14 crc kubenswrapper[4816]: I0316 00:08:14.603184 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:14 crc kubenswrapper[4816]: I0316 00:08:14.603202 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:14Z","lastTransitionTime":"2026-03-16T00:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:14 crc kubenswrapper[4816]: I0316 00:08:14.667411 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:08:14 crc kubenswrapper[4816]: I0316 00:08:14.667455 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:08:14 crc kubenswrapper[4816]: I0316 00:08:14.667468 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:08:14 crc kubenswrapper[4816]: E0316 00:08:14.667634 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:08:14 crc kubenswrapper[4816]: E0316 00:08:14.667760 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:08:14 crc kubenswrapper[4816]: E0316 00:08:14.667881 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:08:14 crc kubenswrapper[4816]: I0316 00:08:14.705672 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:14 crc kubenswrapper[4816]: I0316 00:08:14.705737 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:14 crc kubenswrapper[4816]: I0316 00:08:14.705758 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:14 crc kubenswrapper[4816]: I0316 00:08:14.705789 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:14 crc kubenswrapper[4816]: I0316 00:08:14.705810 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:14Z","lastTransitionTime":"2026-03-16T00:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:14 crc kubenswrapper[4816]: I0316 00:08:14.808950 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:14 crc kubenswrapper[4816]: I0316 00:08:14.809077 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:14 crc kubenswrapper[4816]: I0316 00:08:14.809097 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:14 crc kubenswrapper[4816]: I0316 00:08:14.809123 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:14 crc kubenswrapper[4816]: I0316 00:08:14.809141 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:14Z","lastTransitionTime":"2026-03-16T00:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:14 crc kubenswrapper[4816]: I0316 00:08:14.913042 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:14 crc kubenswrapper[4816]: I0316 00:08:14.913121 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:14 crc kubenswrapper[4816]: I0316 00:08:14.913143 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:14 crc kubenswrapper[4816]: I0316 00:08:14.913174 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:14 crc kubenswrapper[4816]: I0316 00:08:14.913195 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:14Z","lastTransitionTime":"2026-03-16T00:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:15 crc kubenswrapper[4816]: I0316 00:08:15.016204 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:15 crc kubenswrapper[4816]: I0316 00:08:15.016331 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:15 crc kubenswrapper[4816]: I0316 00:08:15.016366 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:15 crc kubenswrapper[4816]: I0316 00:08:15.016399 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:15 crc kubenswrapper[4816]: I0316 00:08:15.016423 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:15Z","lastTransitionTime":"2026-03-16T00:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:15 crc kubenswrapper[4816]: I0316 00:08:15.119508 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:15 crc kubenswrapper[4816]: I0316 00:08:15.119635 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:15 crc kubenswrapper[4816]: I0316 00:08:15.119661 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:15 crc kubenswrapper[4816]: I0316 00:08:15.119696 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:15 crc kubenswrapper[4816]: I0316 00:08:15.119720 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:15Z","lastTransitionTime":"2026-03-16T00:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:15 crc kubenswrapper[4816]: I0316 00:08:15.222819 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:15 crc kubenswrapper[4816]: I0316 00:08:15.222905 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:15 crc kubenswrapper[4816]: I0316 00:08:15.222922 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:15 crc kubenswrapper[4816]: I0316 00:08:15.222949 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:15 crc kubenswrapper[4816]: I0316 00:08:15.222967 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:15Z","lastTransitionTime":"2026-03-16T00:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:15 crc kubenswrapper[4816]: I0316 00:08:15.326096 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:15 crc kubenswrapper[4816]: I0316 00:08:15.326153 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:15 crc kubenswrapper[4816]: I0316 00:08:15.326170 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:15 crc kubenswrapper[4816]: I0316 00:08:15.326192 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:15 crc kubenswrapper[4816]: I0316 00:08:15.326208 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:15Z","lastTransitionTime":"2026-03-16T00:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:15 crc kubenswrapper[4816]: I0316 00:08:15.429436 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:15 crc kubenswrapper[4816]: I0316 00:08:15.429517 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:15 crc kubenswrapper[4816]: I0316 00:08:15.429532 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:15 crc kubenswrapper[4816]: I0316 00:08:15.429575 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:15 crc kubenswrapper[4816]: I0316 00:08:15.429589 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:15Z","lastTransitionTime":"2026-03-16T00:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:15 crc kubenswrapper[4816]: I0316 00:08:15.532781 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:15 crc kubenswrapper[4816]: I0316 00:08:15.532840 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:15 crc kubenswrapper[4816]: I0316 00:08:15.532863 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:15 crc kubenswrapper[4816]: I0316 00:08:15.532888 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:15 crc kubenswrapper[4816]: I0316 00:08:15.532907 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:15Z","lastTransitionTime":"2026-03-16T00:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:15 crc kubenswrapper[4816]: I0316 00:08:15.636013 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:15 crc kubenswrapper[4816]: I0316 00:08:15.636081 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:15 crc kubenswrapper[4816]: I0316 00:08:15.636099 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:15 crc kubenswrapper[4816]: I0316 00:08:15.636129 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:15 crc kubenswrapper[4816]: I0316 00:08:15.636148 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:15Z","lastTransitionTime":"2026-03-16T00:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:15 crc kubenswrapper[4816]: I0316 00:08:15.738368 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:15 crc kubenswrapper[4816]: I0316 00:08:15.738437 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:15 crc kubenswrapper[4816]: I0316 00:08:15.738455 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:15 crc kubenswrapper[4816]: I0316 00:08:15.738482 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:15 crc kubenswrapper[4816]: I0316 00:08:15.738502 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:15Z","lastTransitionTime":"2026-03-16T00:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:15 crc kubenswrapper[4816]: I0316 00:08:15.841347 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:15 crc kubenswrapper[4816]: I0316 00:08:15.841416 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:15 crc kubenswrapper[4816]: I0316 00:08:15.841429 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:15 crc kubenswrapper[4816]: I0316 00:08:15.841452 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:15 crc kubenswrapper[4816]: I0316 00:08:15.841468 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:15Z","lastTransitionTime":"2026-03-16T00:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:15 crc kubenswrapper[4816]: I0316 00:08:15.944143 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:15 crc kubenswrapper[4816]: I0316 00:08:15.944200 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:15 crc kubenswrapper[4816]: I0316 00:08:15.944209 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:15 crc kubenswrapper[4816]: I0316 00:08:15.944229 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:15 crc kubenswrapper[4816]: I0316 00:08:15.944241 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:15Z","lastTransitionTime":"2026-03-16T00:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:16 crc kubenswrapper[4816]: I0316 00:08:16.047205 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:16 crc kubenswrapper[4816]: I0316 00:08:16.047273 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:16 crc kubenswrapper[4816]: I0316 00:08:16.047289 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:16 crc kubenswrapper[4816]: I0316 00:08:16.047314 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:16 crc kubenswrapper[4816]: I0316 00:08:16.047337 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:16Z","lastTransitionTime":"2026-03-16T00:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:16 crc kubenswrapper[4816]: I0316 00:08:16.150055 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:16 crc kubenswrapper[4816]: I0316 00:08:16.150124 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:16 crc kubenswrapper[4816]: I0316 00:08:16.150141 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:16 crc kubenswrapper[4816]: I0316 00:08:16.150166 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:16 crc kubenswrapper[4816]: I0316 00:08:16.150184 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:16Z","lastTransitionTime":"2026-03-16T00:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:16 crc kubenswrapper[4816]: I0316 00:08:16.253002 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:16 crc kubenswrapper[4816]: I0316 00:08:16.253069 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:16 crc kubenswrapper[4816]: I0316 00:08:16.253080 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:16 crc kubenswrapper[4816]: I0316 00:08:16.253096 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:16 crc kubenswrapper[4816]: I0316 00:08:16.253140 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:16Z","lastTransitionTime":"2026-03-16T00:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:16 crc kubenswrapper[4816]: I0316 00:08:16.338655 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:08:16 crc kubenswrapper[4816]: E0316 00:08:16.338913 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-16 00:08:24.338886913 +0000 UTC m=+97.435186906 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:08:16 crc kubenswrapper[4816]: I0316 00:08:16.355940 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:16 crc kubenswrapper[4816]: I0316 00:08:16.356005 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:16 crc kubenswrapper[4816]: I0316 00:08:16.356027 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:16 crc kubenswrapper[4816]: I0316 00:08:16.356066 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:16 crc kubenswrapper[4816]: I0316 00:08:16.356115 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:16Z","lastTransitionTime":"2026-03-16T00:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:16 crc kubenswrapper[4816]: I0316 00:08:16.439644 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:08:16 crc kubenswrapper[4816]: I0316 00:08:16.439715 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:08:16 crc kubenswrapper[4816]: I0316 00:08:16.439759 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:08:16 crc kubenswrapper[4816]: I0316 00:08:16.439800 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:08:16 crc kubenswrapper[4816]: E0316 00:08:16.439966 4816 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" 
not registered Mar 16 00:08:16 crc kubenswrapper[4816]: E0316 00:08:16.440023 4816 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 16 00:08:16 crc kubenswrapper[4816]: E0316 00:08:16.440027 4816 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 16 00:08:16 crc kubenswrapper[4816]: E0316 00:08:16.440172 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-16 00:08:24.440137277 +0000 UTC m=+97.536437270 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 16 00:08:16 crc kubenswrapper[4816]: E0316 00:08:16.440047 4816 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 16 00:08:16 crc kubenswrapper[4816]: E0316 00:08:16.439991 4816 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 16 00:08:16 crc kubenswrapper[4816]: E0316 00:08:16.440273 4816 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-16 00:08:24.440250771 +0000 UTC m=+97.536550764 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 16 00:08:16 crc kubenswrapper[4816]: E0316 00:08:16.440361 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-16 00:08:24.440332154 +0000 UTC m=+97.536632147 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 16 00:08:16 crc kubenswrapper[4816]: E0316 00:08:16.440052 4816 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 16 00:08:16 crc kubenswrapper[4816]: E0316 00:08:16.440405 4816 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 16 00:08:16 crc kubenswrapper[4816]: E0316 00:08:16.440429 4816 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 16 00:08:16 crc kubenswrapper[4816]: E0316 00:08:16.440488 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-16 00:08:24.440474189 +0000 UTC m=+97.536774172 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 16 00:08:16 crc kubenswrapper[4816]: I0316 00:08:16.458698 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:16 crc kubenswrapper[4816]: I0316 00:08:16.458784 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:16 crc kubenswrapper[4816]: I0316 00:08:16.458808 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:16 crc kubenswrapper[4816]: I0316 00:08:16.458840 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:16 crc kubenswrapper[4816]: I0316 00:08:16.458863 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:16Z","lastTransitionTime":"2026-03-16T00:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:16 crc kubenswrapper[4816]: I0316 00:08:16.562443 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:16 crc kubenswrapper[4816]: I0316 00:08:16.562534 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:16 crc kubenswrapper[4816]: I0316 00:08:16.562573 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:16 crc kubenswrapper[4816]: I0316 00:08:16.562608 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:16 crc kubenswrapper[4816]: I0316 00:08:16.562639 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:16Z","lastTransitionTime":"2026-03-16T00:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:16 crc kubenswrapper[4816]: I0316 00:08:16.665431 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:16 crc kubenswrapper[4816]: I0316 00:08:16.665493 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:16 crc kubenswrapper[4816]: I0316 00:08:16.665512 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:16 crc kubenswrapper[4816]: I0316 00:08:16.665539 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:16 crc kubenswrapper[4816]: I0316 00:08:16.665593 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:16Z","lastTransitionTime":"2026-03-16T00:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:16 crc kubenswrapper[4816]: I0316 00:08:16.667073 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:08:16 crc kubenswrapper[4816]: I0316 00:08:16.667158 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:08:16 crc kubenswrapper[4816]: E0316 00:08:16.667323 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:08:16 crc kubenswrapper[4816]: I0316 00:08:16.667429 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:08:16 crc kubenswrapper[4816]: E0316 00:08:16.667662 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:08:16 crc kubenswrapper[4816]: E0316 00:08:16.667787 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:08:16 crc kubenswrapper[4816]: I0316 00:08:16.768600 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:16 crc kubenswrapper[4816]: I0316 00:08:16.768671 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:16 crc kubenswrapper[4816]: I0316 00:08:16.768692 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:16 crc kubenswrapper[4816]: I0316 00:08:16.768737 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:16 crc kubenswrapper[4816]: I0316 00:08:16.768765 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:16Z","lastTransitionTime":"2026-03-16T00:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:16 crc kubenswrapper[4816]: I0316 00:08:16.872192 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:16 crc kubenswrapper[4816]: I0316 00:08:16.872251 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:16 crc kubenswrapper[4816]: I0316 00:08:16.872263 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:16 crc kubenswrapper[4816]: I0316 00:08:16.872281 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:16 crc kubenswrapper[4816]: I0316 00:08:16.872293 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:16Z","lastTransitionTime":"2026-03-16T00:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:16 crc kubenswrapper[4816]: I0316 00:08:16.976100 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:16 crc kubenswrapper[4816]: I0316 00:08:16.976159 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:16 crc kubenswrapper[4816]: I0316 00:08:16.976172 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:16 crc kubenswrapper[4816]: I0316 00:08:16.976194 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:16 crc kubenswrapper[4816]: I0316 00:08:16.976210 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:16Z","lastTransitionTime":"2026-03-16T00:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:16 crc kubenswrapper[4816]: I0316 00:08:16.997613 4816 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 16 00:08:17 crc kubenswrapper[4816]: I0316 00:08:17.079836 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:17 crc kubenswrapper[4816]: I0316 00:08:17.079906 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:17 crc kubenswrapper[4816]: I0316 00:08:17.079923 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:17 crc kubenswrapper[4816]: I0316 00:08:17.079949 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:17 crc kubenswrapper[4816]: I0316 00:08:17.079968 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:17Z","lastTransitionTime":"2026-03-16T00:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:17 crc kubenswrapper[4816]: I0316 00:08:17.182598 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:17 crc kubenswrapper[4816]: I0316 00:08:17.182672 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:17 crc kubenswrapper[4816]: I0316 00:08:17.182689 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:17 crc kubenswrapper[4816]: I0316 00:08:17.182715 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:17 crc kubenswrapper[4816]: I0316 00:08:17.182734 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:17Z","lastTransitionTime":"2026-03-16T00:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:17 crc kubenswrapper[4816]: I0316 00:08:17.285765 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:17 crc kubenswrapper[4816]: I0316 00:08:17.285836 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:17 crc kubenswrapper[4816]: I0316 00:08:17.285847 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:17 crc kubenswrapper[4816]: I0316 00:08:17.285868 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:17 crc kubenswrapper[4816]: I0316 00:08:17.285883 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:17Z","lastTransitionTime":"2026-03-16T00:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:17 crc kubenswrapper[4816]: I0316 00:08:17.389136 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:17 crc kubenswrapper[4816]: I0316 00:08:17.389265 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:17 crc kubenswrapper[4816]: I0316 00:08:17.389288 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:17 crc kubenswrapper[4816]: I0316 00:08:17.389312 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:17 crc kubenswrapper[4816]: I0316 00:08:17.389326 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:17Z","lastTransitionTime":"2026-03-16T00:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:17 crc kubenswrapper[4816]: I0316 00:08:17.491810 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:17 crc kubenswrapper[4816]: I0316 00:08:17.491867 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:17 crc kubenswrapper[4816]: I0316 00:08:17.491883 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:17 crc kubenswrapper[4816]: I0316 00:08:17.491904 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:17 crc kubenswrapper[4816]: I0316 00:08:17.491916 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:17Z","lastTransitionTime":"2026-03-16T00:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:17 crc kubenswrapper[4816]: I0316 00:08:17.594216 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:17 crc kubenswrapper[4816]: I0316 00:08:17.594267 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:17 crc kubenswrapper[4816]: I0316 00:08:17.594287 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:17 crc kubenswrapper[4816]: I0316 00:08:17.594312 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:17 crc kubenswrapper[4816]: I0316 00:08:17.594334 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:17Z","lastTransitionTime":"2026-03-16T00:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:17 crc kubenswrapper[4816]: I0316 00:08:17.686014 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 16 00:08:17 crc kubenswrapper[4816]: I0316 00:08:17.697434 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:17 crc kubenswrapper[4816]: I0316 00:08:17.697509 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:17 crc kubenswrapper[4816]: I0316 00:08:17.697535 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:17 crc kubenswrapper[4816]: I0316 00:08:17.697597 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:17 crc kubenswrapper[4816]: I0316 00:08:17.697618 4816 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:17Z","lastTransitionTime":"2026-03-16T00:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:17 crc kubenswrapper[4816]: I0316 00:08:17.706951 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 16 00:08:17 crc kubenswrapper[4816]: I0316 00:08:17.722018 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 16 00:08:17 crc kubenswrapper[4816]: I0316 00:08:17.732119 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 16 00:08:17 crc kubenswrapper[4816]: I0316 00:08:17.741465 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 16 00:08:17 crc kubenswrapper[4816]: I0316 00:08:17.749357 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 16 00:08:17 crc kubenswrapper[4816]: I0316 00:08:17.799763 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:17 crc kubenswrapper[4816]: I0316 00:08:17.799818 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:17 crc kubenswrapper[4816]: I0316 00:08:17.799842 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:17 crc kubenswrapper[4816]: I0316 00:08:17.799866 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:17 crc kubenswrapper[4816]: I0316 00:08:17.799884 4816 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:17Z","lastTransitionTime":"2026-03-16T00:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:17 crc kubenswrapper[4816]: I0316 00:08:17.906994 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:17 crc kubenswrapper[4816]: I0316 00:08:17.907617 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:17 crc kubenswrapper[4816]: I0316 00:08:17.907680 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:17 crc kubenswrapper[4816]: I0316 00:08:17.907718 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:17 crc kubenswrapper[4816]: I0316 00:08:17.907742 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:17Z","lastTransitionTime":"2026-03-16T00:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:18 crc kubenswrapper[4816]: I0316 00:08:18.010606 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:18 crc kubenswrapper[4816]: I0316 00:08:18.010690 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:18 crc kubenswrapper[4816]: I0316 00:08:18.010710 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:18 crc kubenswrapper[4816]: I0316 00:08:18.010736 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:18 crc kubenswrapper[4816]: I0316 00:08:18.010753 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:18Z","lastTransitionTime":"2026-03-16T00:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:18 crc kubenswrapper[4816]: I0316 00:08:18.112895 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:18 crc kubenswrapper[4816]: I0316 00:08:18.112945 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:18 crc kubenswrapper[4816]: I0316 00:08:18.112956 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:18 crc kubenswrapper[4816]: I0316 00:08:18.112974 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:18 crc kubenswrapper[4816]: I0316 00:08:18.112986 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:18Z","lastTransitionTime":"2026-03-16T00:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:18 crc kubenswrapper[4816]: I0316 00:08:18.215991 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:18 crc kubenswrapper[4816]: I0316 00:08:18.216050 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:18 crc kubenswrapper[4816]: I0316 00:08:18.216062 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:18 crc kubenswrapper[4816]: I0316 00:08:18.216077 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:18 crc kubenswrapper[4816]: I0316 00:08:18.216085 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:18Z","lastTransitionTime":"2026-03-16T00:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:18 crc kubenswrapper[4816]: I0316 00:08:18.318407 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:18 crc kubenswrapper[4816]: I0316 00:08:18.318460 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:18 crc kubenswrapper[4816]: I0316 00:08:18.318477 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:18 crc kubenswrapper[4816]: I0316 00:08:18.318500 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:18 crc kubenswrapper[4816]: I0316 00:08:18.318519 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:18Z","lastTransitionTime":"2026-03-16T00:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:18 crc kubenswrapper[4816]: I0316 00:08:18.374455 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:18 crc kubenswrapper[4816]: I0316 00:08:18.374502 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:18 crc kubenswrapper[4816]: I0316 00:08:18.374514 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:18 crc kubenswrapper[4816]: I0316 00:08:18.374534 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:18 crc kubenswrapper[4816]: I0316 00:08:18.374574 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:18Z","lastTransitionTime":"2026-03-16T00:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:18 crc kubenswrapper[4816]: E0316 00:08:18.389134 4816 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d8fb348c-4907-4c8f-859a-735976530e03\\\",\\\"systemUUID\\\":\\\"e97abf98-298f-4589-a70a-4cfb5cb2994a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 16 00:08:18 crc kubenswrapper[4816]: I0316 00:08:18.394855 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:18 crc kubenswrapper[4816]: I0316 00:08:18.394914 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:18 crc kubenswrapper[4816]: I0316 00:08:18.394934 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:18 crc kubenswrapper[4816]: I0316 00:08:18.394960 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:18 crc kubenswrapper[4816]: I0316 00:08:18.394977 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:18Z","lastTransitionTime":"2026-03-16T00:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:18 crc kubenswrapper[4816]: E0316 00:08:18.410253 4816 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d8fb348c-4907-4c8f-859a-735976530e03\\\",\\\"systemUUID\\\":\\\"e97abf98-298f-4589-a70a-4cfb5cb2994a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 16 00:08:18 crc kubenswrapper[4816]: I0316 00:08:18.417203 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:18 crc kubenswrapper[4816]: I0316 00:08:18.417250 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:18 crc kubenswrapper[4816]: I0316 00:08:18.417261 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:18 crc kubenswrapper[4816]: I0316 00:08:18.417279 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:18 crc kubenswrapper[4816]: I0316 00:08:18.417290 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:18Z","lastTransitionTime":"2026-03-16T00:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:18 crc kubenswrapper[4816]: E0316 00:08:18.433076 4816 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d8fb348c-4907-4c8f-859a-735976530e03\\\",\\\"systemUUID\\\":\\\"e97abf98-298f-4589-a70a-4cfb5cb2994a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 16 00:08:18 crc kubenswrapper[4816]: I0316 00:08:18.437762 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:18 crc kubenswrapper[4816]: I0316 00:08:18.437796 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:18 crc kubenswrapper[4816]: I0316 00:08:18.437808 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:18 crc kubenswrapper[4816]: I0316 00:08:18.437824 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:18 crc kubenswrapper[4816]: I0316 00:08:18.437836 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:18Z","lastTransitionTime":"2026-03-16T00:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:18 crc kubenswrapper[4816]: E0316 00:08:18.453051 4816 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d8fb348c-4907-4c8f-859a-735976530e03\\\",\\\"systemUUID\\\":\\\"e97abf98-298f-4589-a70a-4cfb5cb2994a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 16 00:08:18 crc kubenswrapper[4816]: I0316 00:08:18.457133 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:18 crc kubenswrapper[4816]: I0316 00:08:18.457226 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:18 crc kubenswrapper[4816]: I0316 00:08:18.457241 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:18 crc kubenswrapper[4816]: I0316 00:08:18.457285 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:18 crc kubenswrapper[4816]: I0316 00:08:18.457302 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:18Z","lastTransitionTime":"2026-03-16T00:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:18 crc kubenswrapper[4816]: E0316 00:08:18.469394 4816 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d8fb348c-4907-4c8f-859a-735976530e03\\\",\\\"systemUUID\\\":\\\"e97abf98-298f-4589-a70a-4cfb5cb2994a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 16 00:08:18 crc kubenswrapper[4816]: E0316 00:08:18.470209 4816 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 16 00:08:18 crc kubenswrapper[4816]: I0316 00:08:18.472587 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:18 crc kubenswrapper[4816]: I0316 00:08:18.472635 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:18 crc kubenswrapper[4816]: I0316 00:08:18.472655 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:18 crc kubenswrapper[4816]: I0316 00:08:18.472682 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:18 crc kubenswrapper[4816]: I0316 00:08:18.472701 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:18Z","lastTransitionTime":"2026-03-16T00:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:18 crc kubenswrapper[4816]: I0316 00:08:18.576013 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:18 crc kubenswrapper[4816]: I0316 00:08:18.576061 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:18 crc kubenswrapper[4816]: I0316 00:08:18.576074 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:18 crc kubenswrapper[4816]: I0316 00:08:18.576094 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:18 crc kubenswrapper[4816]: I0316 00:08:18.576107 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:18Z","lastTransitionTime":"2026-03-16T00:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:18 crc kubenswrapper[4816]: I0316 00:08:18.667432 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:08:18 crc kubenswrapper[4816]: I0316 00:08:18.667518 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:08:18 crc kubenswrapper[4816]: I0316 00:08:18.667452 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:08:18 crc kubenswrapper[4816]: E0316 00:08:18.667633 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:08:18 crc kubenswrapper[4816]: E0316 00:08:18.667745 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:08:18 crc kubenswrapper[4816]: E0316 00:08:18.667863 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:08:18 crc kubenswrapper[4816]: I0316 00:08:18.678840 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:18 crc kubenswrapper[4816]: I0316 00:08:18.678904 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:18 crc kubenswrapper[4816]: I0316 00:08:18.678931 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:18 crc kubenswrapper[4816]: I0316 00:08:18.678959 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:18 crc kubenswrapper[4816]: I0316 00:08:18.678980 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:18Z","lastTransitionTime":"2026-03-16T00:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:18 crc kubenswrapper[4816]: I0316 00:08:18.781600 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:18 crc kubenswrapper[4816]: I0316 00:08:18.781642 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:18 crc kubenswrapper[4816]: I0316 00:08:18.781653 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:18 crc kubenswrapper[4816]: I0316 00:08:18.781670 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:18 crc kubenswrapper[4816]: I0316 00:08:18.781681 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:18Z","lastTransitionTime":"2026-03-16T00:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:18 crc kubenswrapper[4816]: I0316 00:08:18.884247 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:18 crc kubenswrapper[4816]: I0316 00:08:18.884313 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:18 crc kubenswrapper[4816]: I0316 00:08:18.884337 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:18 crc kubenswrapper[4816]: I0316 00:08:18.884372 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:18 crc kubenswrapper[4816]: I0316 00:08:18.884396 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:18Z","lastTransitionTime":"2026-03-16T00:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:18 crc kubenswrapper[4816]: I0316 00:08:18.986842 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:18 crc kubenswrapper[4816]: I0316 00:08:18.986914 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:18 crc kubenswrapper[4816]: I0316 00:08:18.986932 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:18 crc kubenswrapper[4816]: I0316 00:08:18.986955 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:18 crc kubenswrapper[4816]: I0316 00:08:18.986973 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:18Z","lastTransitionTime":"2026-03-16T00:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:19 crc kubenswrapper[4816]: I0316 00:08:19.089646 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:19 crc kubenswrapper[4816]: I0316 00:08:19.089689 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:19 crc kubenswrapper[4816]: I0316 00:08:19.089699 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:19 crc kubenswrapper[4816]: I0316 00:08:19.089715 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:19 crc kubenswrapper[4816]: I0316 00:08:19.089725 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:19Z","lastTransitionTime":"2026-03-16T00:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:19 crc kubenswrapper[4816]: I0316 00:08:19.192859 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:19 crc kubenswrapper[4816]: I0316 00:08:19.192924 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:19 crc kubenswrapper[4816]: I0316 00:08:19.192938 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:19 crc kubenswrapper[4816]: I0316 00:08:19.192955 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:19 crc kubenswrapper[4816]: I0316 00:08:19.192964 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:19Z","lastTransitionTime":"2026-03-16T00:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:19 crc kubenswrapper[4816]: I0316 00:08:19.295910 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:19 crc kubenswrapper[4816]: I0316 00:08:19.295959 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:19 crc kubenswrapper[4816]: I0316 00:08:19.295971 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:19 crc kubenswrapper[4816]: I0316 00:08:19.295988 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:19 crc kubenswrapper[4816]: I0316 00:08:19.296000 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:19Z","lastTransitionTime":"2026-03-16T00:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:19 crc kubenswrapper[4816]: I0316 00:08:19.398661 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:19 crc kubenswrapper[4816]: I0316 00:08:19.398725 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:19 crc kubenswrapper[4816]: I0316 00:08:19.398747 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:19 crc kubenswrapper[4816]: I0316 00:08:19.398776 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:19 crc kubenswrapper[4816]: I0316 00:08:19.398796 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:19Z","lastTransitionTime":"2026-03-16T00:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:19 crc kubenswrapper[4816]: I0316 00:08:19.501172 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:19 crc kubenswrapper[4816]: I0316 00:08:19.501250 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:19 crc kubenswrapper[4816]: I0316 00:08:19.501292 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:19 crc kubenswrapper[4816]: I0316 00:08:19.501323 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:19 crc kubenswrapper[4816]: I0316 00:08:19.501345 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:19Z","lastTransitionTime":"2026-03-16T00:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:19 crc kubenswrapper[4816]: I0316 00:08:19.604411 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:19 crc kubenswrapper[4816]: I0316 00:08:19.604479 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:19 crc kubenswrapper[4816]: I0316 00:08:19.604500 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:19 crc kubenswrapper[4816]: I0316 00:08:19.604529 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:19 crc kubenswrapper[4816]: I0316 00:08:19.604590 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:19Z","lastTransitionTime":"2026-03-16T00:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:19 crc kubenswrapper[4816]: I0316 00:08:19.706982 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:19 crc kubenswrapper[4816]: I0316 00:08:19.707031 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:19 crc kubenswrapper[4816]: I0316 00:08:19.707043 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:19 crc kubenswrapper[4816]: I0316 00:08:19.707061 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:19 crc kubenswrapper[4816]: I0316 00:08:19.707074 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:19Z","lastTransitionTime":"2026-03-16T00:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:19 crc kubenswrapper[4816]: I0316 00:08:19.809338 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:19 crc kubenswrapper[4816]: I0316 00:08:19.809383 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:19 crc kubenswrapper[4816]: I0316 00:08:19.809396 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:19 crc kubenswrapper[4816]: I0316 00:08:19.809414 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:19 crc kubenswrapper[4816]: I0316 00:08:19.809426 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:19Z","lastTransitionTime":"2026-03-16T00:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:19 crc kubenswrapper[4816]: I0316 00:08:19.911589 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:19 crc kubenswrapper[4816]: I0316 00:08:19.911747 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:19 crc kubenswrapper[4816]: I0316 00:08:19.911761 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:19 crc kubenswrapper[4816]: I0316 00:08:19.911784 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:19 crc kubenswrapper[4816]: I0316 00:08:19.911797 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:19Z","lastTransitionTime":"2026-03-16T00:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:20 crc kubenswrapper[4816]: I0316 00:08:20.015072 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:20 crc kubenswrapper[4816]: I0316 00:08:20.015142 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:20 crc kubenswrapper[4816]: I0316 00:08:20.015173 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:20 crc kubenswrapper[4816]: I0316 00:08:20.015207 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:20 crc kubenswrapper[4816]: I0316 00:08:20.015228 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:20Z","lastTransitionTime":"2026-03-16T00:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:20 crc kubenswrapper[4816]: I0316 00:08:20.118146 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:20 crc kubenswrapper[4816]: I0316 00:08:20.118207 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:20 crc kubenswrapper[4816]: I0316 00:08:20.118225 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:20 crc kubenswrapper[4816]: I0316 00:08:20.118381 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:20 crc kubenswrapper[4816]: I0316 00:08:20.118440 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:20Z","lastTransitionTime":"2026-03-16T00:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:20 crc kubenswrapper[4816]: I0316 00:08:20.221316 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:20 crc kubenswrapper[4816]: I0316 00:08:20.221364 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:20 crc kubenswrapper[4816]: I0316 00:08:20.221376 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:20 crc kubenswrapper[4816]: I0316 00:08:20.221395 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:20 crc kubenswrapper[4816]: I0316 00:08:20.221409 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:20Z","lastTransitionTime":"2026-03-16T00:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:20 crc kubenswrapper[4816]: I0316 00:08:20.324306 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:20 crc kubenswrapper[4816]: I0316 00:08:20.324358 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:20 crc kubenswrapper[4816]: I0316 00:08:20.324376 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:20 crc kubenswrapper[4816]: I0316 00:08:20.324399 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:20 crc kubenswrapper[4816]: I0316 00:08:20.324415 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:20Z","lastTransitionTime":"2026-03-16T00:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:20 crc kubenswrapper[4816]: I0316 00:08:20.427058 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:20 crc kubenswrapper[4816]: I0316 00:08:20.427101 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:20 crc kubenswrapper[4816]: I0316 00:08:20.427112 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:20 crc kubenswrapper[4816]: I0316 00:08:20.427128 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:20 crc kubenswrapper[4816]: I0316 00:08:20.427140 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:20Z","lastTransitionTime":"2026-03-16T00:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:20 crc kubenswrapper[4816]: I0316 00:08:20.529501 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:20 crc kubenswrapper[4816]: I0316 00:08:20.529575 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:20 crc kubenswrapper[4816]: I0316 00:08:20.529589 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:20 crc kubenswrapper[4816]: I0316 00:08:20.529610 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:20 crc kubenswrapper[4816]: I0316 00:08:20.529623 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:20Z","lastTransitionTime":"2026-03-16T00:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:20 crc kubenswrapper[4816]: I0316 00:08:20.632395 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:20 crc kubenswrapper[4816]: I0316 00:08:20.632441 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:20 crc kubenswrapper[4816]: I0316 00:08:20.632458 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:20 crc kubenswrapper[4816]: I0316 00:08:20.632488 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:20 crc kubenswrapper[4816]: I0316 00:08:20.632513 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:20Z","lastTransitionTime":"2026-03-16T00:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:20 crc kubenswrapper[4816]: I0316 00:08:20.667398 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:08:20 crc kubenswrapper[4816]: I0316 00:08:20.667500 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:08:20 crc kubenswrapper[4816]: I0316 00:08:20.667605 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:08:20 crc kubenswrapper[4816]: E0316 00:08:20.667517 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:08:20 crc kubenswrapper[4816]: E0316 00:08:20.667718 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:08:20 crc kubenswrapper[4816]: E0316 00:08:20.668039 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:08:20 crc kubenswrapper[4816]: I0316 00:08:20.685515 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 16 00:08:20 crc kubenswrapper[4816]: I0316 00:08:20.685805 4816 scope.go:117] "RemoveContainer" containerID="29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f" Mar 16 00:08:20 crc kubenswrapper[4816]: E0316 00:08:20.685950 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 16 00:08:20 crc kubenswrapper[4816]: I0316 00:08:20.734285 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:20 crc kubenswrapper[4816]: I0316 00:08:20.734324 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:20 crc kubenswrapper[4816]: I0316 00:08:20.734337 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:20 crc kubenswrapper[4816]: I0316 00:08:20.734355 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:20 crc kubenswrapper[4816]: I0316 00:08:20.734367 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:20Z","lastTransitionTime":"2026-03-16T00:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network 
plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:20 crc kubenswrapper[4816]: I0316 00:08:20.838083 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:20 crc kubenswrapper[4816]: I0316 00:08:20.838135 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:20 crc kubenswrapper[4816]: I0316 00:08:20.838146 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:20 crc kubenswrapper[4816]: I0316 00:08:20.838169 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:20 crc kubenswrapper[4816]: I0316 00:08:20.838182 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:20Z","lastTransitionTime":"2026-03-16T00:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:20 crc kubenswrapper[4816]: I0316 00:08:20.940758 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:20 crc kubenswrapper[4816]: I0316 00:08:20.940825 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:20 crc kubenswrapper[4816]: I0316 00:08:20.940887 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:20 crc kubenswrapper[4816]: I0316 00:08:20.940914 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:20 crc kubenswrapper[4816]: I0316 00:08:20.940937 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:20Z","lastTransitionTime":"2026-03-16T00:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:21 crc kubenswrapper[4816]: I0316 00:08:21.043468 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:21 crc kubenswrapper[4816]: I0316 00:08:21.043503 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:21 crc kubenswrapper[4816]: I0316 00:08:21.043512 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:21 crc kubenswrapper[4816]: I0316 00:08:21.043527 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:21 crc kubenswrapper[4816]: I0316 00:08:21.043537 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:21Z","lastTransitionTime":"2026-03-16T00:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:21 crc kubenswrapper[4816]: I0316 00:08:21.067916 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"37869baff974556c22aadadfa4b528b5d6b9ad236107b4dbc945b3cdbf743f20"} Mar 16 00:08:21 crc kubenswrapper[4816]: I0316 00:08:21.068263 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"2f9fe727708fdaca68596b7d305629ade45b061a88a81085f885d490ab99e24b"} Mar 16 00:08:21 crc kubenswrapper[4816]: I0316 00:08:21.068564 4816 scope.go:117] "RemoveContainer" containerID="29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f" Mar 16 00:08:21 crc kubenswrapper[4816]: E0316 00:08:21.068754 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 16 00:08:21 crc kubenswrapper[4816]: I0316 00:08:21.084748 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 16 00:08:21 crc kubenswrapper[4816]: I0316 00:08:21.099231 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c8786d1-025d-4788-bafe-c2c2eaf8e398\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready 
status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46db23088e8ba791d736f70a65bd066af80a5ec27c31332d9ac9c52530b21bfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da7b257af12b9ee605dc32a26d9565f866a9cafebb4936a0bde7f42ae9909f80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\"
:\\\"cri-o://c2523ea43f2490afb81354be8a2840f54a3be1a8d3154196e0c8ac365d09723a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0316 00:07:59.373922 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0316 00:07:59.374075 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0316 00:07:59.374860 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2696044728/tls.crt::/tmp/serving-cert-2696044728/tls.key\\\\\\\"\\\\nI0316 00:07:59.560928 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0316 00:07:59.563996 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0316 00:07:59.564013 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0316 00:07:59.564035 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0316 00:07:59.564041 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0316 00:07:59.571475 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0316 00:07:59.571523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571531 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571540 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0316 00:07:59.571574 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0316 00:07:59.571588 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0316 00:07:59.571493 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0316 00:07:59.571597 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0316 00:07:59.573484 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81d0c40be9c3912864280cd98481f837ef29f69d85599b39decf5e9d7a97eff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 16 00:08:21 crc kubenswrapper[4816]: I0316 00:08:21.114758 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 16 00:08:21 crc kubenswrapper[4816]: I0316 00:08:21.131770 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 16 00:08:21 crc kubenswrapper[4816]: I0316 00:08:21.144366 4816 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 16 00:08:21 crc kubenswrapper[4816]: I0316 00:08:21.146402 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:21 crc kubenswrapper[4816]: I0316 00:08:21.146441 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:21 crc kubenswrapper[4816]: I0316 00:08:21.146449 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:21 crc kubenswrapper[4816]: I0316 00:08:21.146465 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:21 crc kubenswrapper[4816]: I0316 00:08:21.146474 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:21Z","lastTransitionTime":"2026-03-16T00:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:21 crc kubenswrapper[4816]: I0316 00:08:21.158745 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 16 00:08:21 crc kubenswrapper[4816]: I0316 00:08:21.173578 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37869baff974556c22aadadfa4b528b5d6b9ad236107b4dbc945b3cdbf743f20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f9fe727708fdaca68596b7d305629ade45b061a88a81085f885d490ab99e24b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 16 00:08:21 crc kubenswrapper[4816]: I0316 00:08:21.248935 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:21 crc kubenswrapper[4816]: I0316 00:08:21.249027 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:21 crc kubenswrapper[4816]: I0316 00:08:21.249046 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:21 crc kubenswrapper[4816]: I0316 00:08:21.249072 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:21 crc kubenswrapper[4816]: I0316 00:08:21.249093 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:21Z","lastTransitionTime":"2026-03-16T00:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:21 crc kubenswrapper[4816]: I0316 00:08:21.353057 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:21 crc kubenswrapper[4816]: I0316 00:08:21.353115 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:21 crc kubenswrapper[4816]: I0316 00:08:21.353131 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:21 crc kubenswrapper[4816]: I0316 00:08:21.353155 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:21 crc kubenswrapper[4816]: I0316 00:08:21.353172 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:21Z","lastTransitionTime":"2026-03-16T00:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:21 crc kubenswrapper[4816]: I0316 00:08:21.366881 4816 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 16 00:08:21 crc kubenswrapper[4816]: I0316 00:08:21.456127 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:21 crc kubenswrapper[4816]: I0316 00:08:21.456204 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:21 crc kubenswrapper[4816]: I0316 00:08:21.456226 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:21 crc kubenswrapper[4816]: I0316 00:08:21.456255 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:21 crc kubenswrapper[4816]: I0316 00:08:21.456274 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:21Z","lastTransitionTime":"2026-03-16T00:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:21 crc kubenswrapper[4816]: I0316 00:08:21.558812 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:21 crc kubenswrapper[4816]: I0316 00:08:21.558865 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:21 crc kubenswrapper[4816]: I0316 00:08:21.558878 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:21 crc kubenswrapper[4816]: I0316 00:08:21.558901 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:21 crc kubenswrapper[4816]: I0316 00:08:21.558914 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:21Z","lastTransitionTime":"2026-03-16T00:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:21 crc kubenswrapper[4816]: I0316 00:08:21.662158 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:21 crc kubenswrapper[4816]: I0316 00:08:21.662339 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:21 crc kubenswrapper[4816]: I0316 00:08:21.662378 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:21 crc kubenswrapper[4816]: I0316 00:08:21.662413 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:21 crc kubenswrapper[4816]: I0316 00:08:21.662438 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:21Z","lastTransitionTime":"2026-03-16T00:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:21 crc kubenswrapper[4816]: I0316 00:08:21.697061 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Mar 16 00:08:21 crc kubenswrapper[4816]: I0316 00:08:21.765132 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:21 crc kubenswrapper[4816]: I0316 00:08:21.765174 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:21 crc kubenswrapper[4816]: I0316 00:08:21.765186 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:21 crc kubenswrapper[4816]: I0316 00:08:21.765206 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:21 crc kubenswrapper[4816]: I0316 00:08:21.765218 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:21Z","lastTransitionTime":"2026-03-16T00:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:21 crc kubenswrapper[4816]: I0316 00:08:21.868657 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:21 crc kubenswrapper[4816]: I0316 00:08:21.868726 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:21 crc kubenswrapper[4816]: I0316 00:08:21.868739 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:21 crc kubenswrapper[4816]: I0316 00:08:21.868757 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:21 crc kubenswrapper[4816]: I0316 00:08:21.868769 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:21Z","lastTransitionTime":"2026-03-16T00:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:21 crc kubenswrapper[4816]: I0316 00:08:21.971652 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:21 crc kubenswrapper[4816]: I0316 00:08:21.971700 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:21 crc kubenswrapper[4816]: I0316 00:08:21.971714 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:21 crc kubenswrapper[4816]: I0316 00:08:21.971734 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:21 crc kubenswrapper[4816]: I0316 00:08:21.971747 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:21Z","lastTransitionTime":"2026-03-16T00:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:22 crc kubenswrapper[4816]: I0316 00:08:22.071979 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"723654bc91ce4d40a27b14a9c4df1ed6c8dff2517f378927d7a6a96aeaa6951b"} Mar 16 00:08:22 crc kubenswrapper[4816]: I0316 00:08:22.075805 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:22 crc kubenswrapper[4816]: I0316 00:08:22.075854 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:22 crc kubenswrapper[4816]: I0316 00:08:22.075868 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:22 crc kubenswrapper[4816]: I0316 00:08:22.075886 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:22 crc kubenswrapper[4816]: I0316 00:08:22.075899 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:22Z","lastTransitionTime":"2026-03-16T00:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:22 crc kubenswrapper[4816]: I0316 00:08:22.101214 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49e35ef6-7a6d-438f-b598-4445d73db379\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbe90b49c5d42bb9d6d5d1b8f1d02b571403dd3f39d3118081351d72c4910914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://508e6dbe6f2f1392d16501e09ee2ec523a3c2a245cd96c6ab773e26e83b8bb6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b5a3223ff0fb5b96e5b87ee836add66498165759f4d6ed8cb6413f091967e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://603bdcbd4a6af5fda7ca09e4823e0db8d7ec3e3eb38eddf9f062a14b433cb094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b061f16a65a27b9a5ef66231376b70fec084479a991b7fdd430957df76da32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:22Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:22 crc kubenswrapper[4816]: I0316 00:08:22.117371 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c8786d1-025d-4788-bafe-c2c2eaf8e398\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46db23088e8ba791d736f70a65bd066af80a5ec27c31332d9ac9c52530b21bfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da7b257af12b9ee605dc32a26d9565f866a9cafebb4936a0bde7f42ae9909f80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c2523ea43f2490afb81354be8a2840f54a3be1a8d3154196e0c8ac365d09723a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0316 00:07:59.373922 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0316 00:07:59.374075 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0316 00:07:59.374860 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2696044728/tls.crt::/tmp/serving-cert-2696044728/tls.key\\\\\\\"\\\\nI0316 00:07:59.560928 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0316 00:07:59.563996 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0316 00:07:59.564013 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0316 00:07:59.564035 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0316 00:07:59.564041 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0316 00:07:59.571475 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0316 00:07:59.571523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571531 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571540 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0316 00:07:59.571574 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0316 00:07:59.571588 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0316 00:07:59.571493 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0316 00:07:59.571597 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0316 00:07:59.573484 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81d0c40be9c3912864280cd98481f837ef29f69d85599b39decf5e9d7a97eff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:22Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:22 crc kubenswrapper[4816]: I0316 00:08:22.132220 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:22Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:22 crc kubenswrapper[4816]: I0316 00:08:22.151701 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:22Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:22 crc kubenswrapper[4816]: I0316 00:08:22.172926 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:22Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:22 crc kubenswrapper[4816]: I0316 00:08:22.178438 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:22 crc kubenswrapper[4816]: I0316 00:08:22.178492 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:22 crc kubenswrapper[4816]: I0316 00:08:22.178503 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:22 crc kubenswrapper[4816]: I0316 00:08:22.178526 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:22 crc kubenswrapper[4816]: I0316 00:08:22.178538 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:22Z","lastTransitionTime":"2026-03-16T00:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:22 crc kubenswrapper[4816]: I0316 00:08:22.194018 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37869baff974556c22aadadfa4b528b5d6b9ad236107b4dbc945b3cdbf743f20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://2f9fe727708fdaca68596b7d305629ade45b061a88a81085f885d490ab99e24b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:22Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:22 crc kubenswrapper[4816]: I0316 00:08:22.212252 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://723654bc91ce4d40a27b14a9c4df1ed6c8dff2517f378927d7a6a96aeaa6951b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:22Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:22 crc kubenswrapper[4816]: I0316 00:08:22.227269 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:22Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:22 crc kubenswrapper[4816]: I0316 00:08:22.281820 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:22 crc kubenswrapper[4816]: I0316 00:08:22.281874 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:22 crc kubenswrapper[4816]: I0316 00:08:22.281885 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:22 crc kubenswrapper[4816]: I0316 00:08:22.281905 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:22 crc kubenswrapper[4816]: I0316 00:08:22.281919 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:22Z","lastTransitionTime":"2026-03-16T00:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:22 crc kubenswrapper[4816]: I0316 00:08:22.384777 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:22 crc kubenswrapper[4816]: I0316 00:08:22.384814 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:22 crc kubenswrapper[4816]: I0316 00:08:22.384823 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:22 crc kubenswrapper[4816]: I0316 00:08:22.384838 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:22 crc kubenswrapper[4816]: I0316 00:08:22.384848 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:22Z","lastTransitionTime":"2026-03-16T00:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:22 crc kubenswrapper[4816]: I0316 00:08:22.487596 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:22 crc kubenswrapper[4816]: I0316 00:08:22.487738 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:22 crc kubenswrapper[4816]: I0316 00:08:22.487754 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:22 crc kubenswrapper[4816]: I0316 00:08:22.487774 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:22 crc kubenswrapper[4816]: I0316 00:08:22.487788 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:22Z","lastTransitionTime":"2026-03-16T00:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:22 crc kubenswrapper[4816]: I0316 00:08:22.591370 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:22 crc kubenswrapper[4816]: I0316 00:08:22.591423 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:22 crc kubenswrapper[4816]: I0316 00:08:22.591435 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:22 crc kubenswrapper[4816]: I0316 00:08:22.591454 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:22 crc kubenswrapper[4816]: I0316 00:08:22.591466 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:22Z","lastTransitionTime":"2026-03-16T00:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:22 crc kubenswrapper[4816]: I0316 00:08:22.666773 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:08:22 crc kubenswrapper[4816]: I0316 00:08:22.666776 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:08:22 crc kubenswrapper[4816]: E0316 00:08:22.666948 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:08:22 crc kubenswrapper[4816]: I0316 00:08:22.667091 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:08:22 crc kubenswrapper[4816]: E0316 00:08:22.667157 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:08:22 crc kubenswrapper[4816]: E0316 00:08:22.667345 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:08:22 crc kubenswrapper[4816]: I0316 00:08:22.693329 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:22 crc kubenswrapper[4816]: I0316 00:08:22.693379 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:22 crc kubenswrapper[4816]: I0316 00:08:22.693391 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:22 crc kubenswrapper[4816]: I0316 00:08:22.693410 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:22 crc kubenswrapper[4816]: I0316 00:08:22.693423 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:22Z","lastTransitionTime":"2026-03-16T00:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:22 crc kubenswrapper[4816]: I0316 00:08:22.796939 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:22 crc kubenswrapper[4816]: I0316 00:08:22.796991 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:22 crc kubenswrapper[4816]: I0316 00:08:22.797006 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:22 crc kubenswrapper[4816]: I0316 00:08:22.797036 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:22 crc kubenswrapper[4816]: I0316 00:08:22.797055 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:22Z","lastTransitionTime":"2026-03-16T00:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:22 crc kubenswrapper[4816]: I0316 00:08:22.900765 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:22 crc kubenswrapper[4816]: I0316 00:08:22.900826 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:22 crc kubenswrapper[4816]: I0316 00:08:22.900844 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:22 crc kubenswrapper[4816]: I0316 00:08:22.900874 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:22 crc kubenswrapper[4816]: I0316 00:08:22.900893 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:22Z","lastTransitionTime":"2026-03-16T00:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:23 crc kubenswrapper[4816]: I0316 00:08:23.003441 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:23 crc kubenswrapper[4816]: I0316 00:08:23.003484 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:23 crc kubenswrapper[4816]: I0316 00:08:23.003495 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:23 crc kubenswrapper[4816]: I0316 00:08:23.003513 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:23 crc kubenswrapper[4816]: I0316 00:08:23.003525 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:23Z","lastTransitionTime":"2026-03-16T00:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:23 crc kubenswrapper[4816]: I0316 00:08:23.106199 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:23 crc kubenswrapper[4816]: I0316 00:08:23.106259 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:23 crc kubenswrapper[4816]: I0316 00:08:23.106272 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:23 crc kubenswrapper[4816]: I0316 00:08:23.106293 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:23 crc kubenswrapper[4816]: I0316 00:08:23.106306 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:23Z","lastTransitionTime":"2026-03-16T00:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:23 crc kubenswrapper[4816]: I0316 00:08:23.208958 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:23 crc kubenswrapper[4816]: I0316 00:08:23.209013 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:23 crc kubenswrapper[4816]: I0316 00:08:23.209028 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:23 crc kubenswrapper[4816]: I0316 00:08:23.209053 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:23 crc kubenswrapper[4816]: I0316 00:08:23.209076 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:23Z","lastTransitionTime":"2026-03-16T00:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:23 crc kubenswrapper[4816]: I0316 00:08:23.311674 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:23 crc kubenswrapper[4816]: I0316 00:08:23.311725 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:23 crc kubenswrapper[4816]: I0316 00:08:23.311734 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:23 crc kubenswrapper[4816]: I0316 00:08:23.311760 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:23 crc kubenswrapper[4816]: I0316 00:08:23.311774 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:23Z","lastTransitionTime":"2026-03-16T00:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:23 crc kubenswrapper[4816]: I0316 00:08:23.413884 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:23 crc kubenswrapper[4816]: I0316 00:08:23.413943 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:23 crc kubenswrapper[4816]: I0316 00:08:23.413954 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:23 crc kubenswrapper[4816]: I0316 00:08:23.413975 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:23 crc kubenswrapper[4816]: I0316 00:08:23.414000 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:23Z","lastTransitionTime":"2026-03-16T00:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:23 crc kubenswrapper[4816]: I0316 00:08:23.516519 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:23 crc kubenswrapper[4816]: I0316 00:08:23.516602 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:23 crc kubenswrapper[4816]: I0316 00:08:23.516621 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:23 crc kubenswrapper[4816]: I0316 00:08:23.516647 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:23 crc kubenswrapper[4816]: I0316 00:08:23.516663 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:23Z","lastTransitionTime":"2026-03-16T00:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:23 crc kubenswrapper[4816]: I0316 00:08:23.619612 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:23 crc kubenswrapper[4816]: I0316 00:08:23.619679 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:23 crc kubenswrapper[4816]: I0316 00:08:23.619700 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:23 crc kubenswrapper[4816]: I0316 00:08:23.619729 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:23 crc kubenswrapper[4816]: I0316 00:08:23.619750 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:23Z","lastTransitionTime":"2026-03-16T00:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:23 crc kubenswrapper[4816]: I0316 00:08:23.722542 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:23 crc kubenswrapper[4816]: I0316 00:08:23.723176 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:23 crc kubenswrapper[4816]: I0316 00:08:23.723198 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:23 crc kubenswrapper[4816]: I0316 00:08:23.723224 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:23 crc kubenswrapper[4816]: I0316 00:08:23.723244 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:23Z","lastTransitionTime":"2026-03-16T00:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:23 crc kubenswrapper[4816]: I0316 00:08:23.826383 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:23 crc kubenswrapper[4816]: I0316 00:08:23.826435 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:23 crc kubenswrapper[4816]: I0316 00:08:23.826447 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:23 crc kubenswrapper[4816]: I0316 00:08:23.826466 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:23 crc kubenswrapper[4816]: I0316 00:08:23.826480 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:23Z","lastTransitionTime":"2026-03-16T00:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:23 crc kubenswrapper[4816]: I0316 00:08:23.929839 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:23 crc kubenswrapper[4816]: I0316 00:08:23.929886 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:23 crc kubenswrapper[4816]: I0316 00:08:23.929899 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:23 crc kubenswrapper[4816]: I0316 00:08:23.929918 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:23 crc kubenswrapper[4816]: I0316 00:08:23.929930 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:23Z","lastTransitionTime":"2026-03-16T00:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:24 crc kubenswrapper[4816]: I0316 00:08:24.038245 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:24 crc kubenswrapper[4816]: I0316 00:08:24.038297 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:24 crc kubenswrapper[4816]: I0316 00:08:24.038310 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:24 crc kubenswrapper[4816]: I0316 00:08:24.038328 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:24 crc kubenswrapper[4816]: I0316 00:08:24.038341 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:24Z","lastTransitionTime":"2026-03-16T00:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:24 crc kubenswrapper[4816]: I0316 00:08:24.140940 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:24 crc kubenswrapper[4816]: I0316 00:08:24.140988 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:24 crc kubenswrapper[4816]: I0316 00:08:24.141000 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:24 crc kubenswrapper[4816]: I0316 00:08:24.141022 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:24 crc kubenswrapper[4816]: I0316 00:08:24.141035 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:24Z","lastTransitionTime":"2026-03-16T00:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:24 crc kubenswrapper[4816]: I0316 00:08:24.244263 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:24 crc kubenswrapper[4816]: I0316 00:08:24.244297 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:24 crc kubenswrapper[4816]: I0316 00:08:24.244309 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:24 crc kubenswrapper[4816]: I0316 00:08:24.244328 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:24 crc kubenswrapper[4816]: I0316 00:08:24.244341 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:24Z","lastTransitionTime":"2026-03-16T00:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:24 crc kubenswrapper[4816]: I0316 00:08:24.346601 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:24 crc kubenswrapper[4816]: I0316 00:08:24.346631 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:24 crc kubenswrapper[4816]: I0316 00:08:24.346640 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:24 crc kubenswrapper[4816]: I0316 00:08:24.346654 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:24 crc kubenswrapper[4816]: I0316 00:08:24.346664 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:24Z","lastTransitionTime":"2026-03-16T00:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:24 crc kubenswrapper[4816]: I0316 00:08:24.410762 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:08:24 crc kubenswrapper[4816]: E0316 00:08:24.411000 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-16 00:08:40.410975108 +0000 UTC m=+113.507275101 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:08:24 crc kubenswrapper[4816]: I0316 00:08:24.449603 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:24 crc kubenswrapper[4816]: I0316 00:08:24.449650 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:24 crc kubenswrapper[4816]: I0316 00:08:24.449660 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:24 crc kubenswrapper[4816]: I0316 00:08:24.449679 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:24 crc kubenswrapper[4816]: I0316 00:08:24.449690 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:24Z","lastTransitionTime":"2026-03-16T00:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:24 crc kubenswrapper[4816]: I0316 00:08:24.512410 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:08:24 crc kubenswrapper[4816]: I0316 00:08:24.512482 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:08:24 crc kubenswrapper[4816]: I0316 00:08:24.512528 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:08:24 crc kubenswrapper[4816]: I0316 00:08:24.512607 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:08:24 crc kubenswrapper[4816]: E0316 00:08:24.512767 4816 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object 
"openshift-network-console"/"networking-console-plugin-cert" not registered Mar 16 00:08:24 crc kubenswrapper[4816]: E0316 00:08:24.512774 4816 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 16 00:08:24 crc kubenswrapper[4816]: E0316 00:08:24.512845 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-16 00:08:40.512822182 +0000 UTC m=+113.609122175 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 16 00:08:24 crc kubenswrapper[4816]: E0316 00:08:24.512854 4816 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 16 00:08:24 crc kubenswrapper[4816]: E0316 00:08:24.512881 4816 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 16 00:08:24 crc kubenswrapper[4816]: E0316 00:08:24.512988 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. 
No retries permitted until 2026-03-16 00:08:40.512950886 +0000 UTC m=+113.609250989 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 16 00:08:24 crc kubenswrapper[4816]: E0316 00:08:24.513082 4816 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 16 00:08:24 crc kubenswrapper[4816]: E0316 00:08:24.513125 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-16 00:08:40.513112881 +0000 UTC m=+113.609413074 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 16 00:08:24 crc kubenswrapper[4816]: E0316 00:08:24.513213 4816 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 16 00:08:24 crc kubenswrapper[4816]: E0316 00:08:24.513237 4816 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 16 00:08:24 crc kubenswrapper[4816]: E0316 00:08:24.513252 4816 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 16 00:08:24 crc kubenswrapper[4816]: E0316 00:08:24.513291 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-16 00:08:40.513278917 +0000 UTC m=+113.609578880 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 16 00:08:24 crc kubenswrapper[4816]: I0316 00:08:24.552349 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:24 crc kubenswrapper[4816]: I0316 00:08:24.552402 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:24 crc kubenswrapper[4816]: I0316 00:08:24.552415 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:24 crc kubenswrapper[4816]: I0316 00:08:24.552435 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:24 crc kubenswrapper[4816]: I0316 00:08:24.552448 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:24Z","lastTransitionTime":"2026-03-16T00:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:24 crc kubenswrapper[4816]: I0316 00:08:24.656671 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:24 crc kubenswrapper[4816]: I0316 00:08:24.656736 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:24 crc kubenswrapper[4816]: I0316 00:08:24.656757 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:24 crc kubenswrapper[4816]: I0316 00:08:24.656784 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:24 crc kubenswrapper[4816]: I0316 00:08:24.656805 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:24Z","lastTransitionTime":"2026-03-16T00:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:24 crc kubenswrapper[4816]: I0316 00:08:24.667676 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:08:24 crc kubenswrapper[4816]: I0316 00:08:24.667676 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:08:24 crc kubenswrapper[4816]: E0316 00:08:24.667876 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:08:24 crc kubenswrapper[4816]: I0316 00:08:24.667711 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:08:24 crc kubenswrapper[4816]: E0316 00:08:24.668028 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:08:24 crc kubenswrapper[4816]: E0316 00:08:24.668135 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:08:24 crc kubenswrapper[4816]: I0316 00:08:24.759774 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:24 crc kubenswrapper[4816]: I0316 00:08:24.759840 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:24 crc kubenswrapper[4816]: I0316 00:08:24.759860 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:24 crc kubenswrapper[4816]: I0316 00:08:24.759885 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:24 crc kubenswrapper[4816]: I0316 00:08:24.759903 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:24Z","lastTransitionTime":"2026-03-16T00:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:24 crc kubenswrapper[4816]: I0316 00:08:24.862734 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:24 crc kubenswrapper[4816]: I0316 00:08:24.862776 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:24 crc kubenswrapper[4816]: I0316 00:08:24.862784 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:24 crc kubenswrapper[4816]: I0316 00:08:24.862802 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:24 crc kubenswrapper[4816]: I0316 00:08:24.862812 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:24Z","lastTransitionTime":"2026-03-16T00:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:24 crc kubenswrapper[4816]: I0316 00:08:24.966104 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:24 crc kubenswrapper[4816]: I0316 00:08:24.966164 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:24 crc kubenswrapper[4816]: I0316 00:08:24.966182 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:24 crc kubenswrapper[4816]: I0316 00:08:24.966209 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:24 crc kubenswrapper[4816]: I0316 00:08:24.966231 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:24Z","lastTransitionTime":"2026-03-16T00:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:25 crc kubenswrapper[4816]: I0316 00:08:25.068910 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:25 crc kubenswrapper[4816]: I0316 00:08:25.068976 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:25 crc kubenswrapper[4816]: I0316 00:08:25.068992 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:25 crc kubenswrapper[4816]: I0316 00:08:25.069021 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:25 crc kubenswrapper[4816]: I0316 00:08:25.069038 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:25Z","lastTransitionTime":"2026-03-16T00:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:25 crc kubenswrapper[4816]: I0316 00:08:25.172422 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:25 crc kubenswrapper[4816]: I0316 00:08:25.172481 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:25 crc kubenswrapper[4816]: I0316 00:08:25.172496 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:25 crc kubenswrapper[4816]: I0316 00:08:25.172515 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:25 crc kubenswrapper[4816]: I0316 00:08:25.172528 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:25Z","lastTransitionTime":"2026-03-16T00:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:25 crc kubenswrapper[4816]: I0316 00:08:25.276398 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:25 crc kubenswrapper[4816]: I0316 00:08:25.276454 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:25 crc kubenswrapper[4816]: I0316 00:08:25.276468 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:25 crc kubenswrapper[4816]: I0316 00:08:25.276492 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:25 crc kubenswrapper[4816]: I0316 00:08:25.276507 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:25Z","lastTransitionTime":"2026-03-16T00:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:25 crc kubenswrapper[4816]: I0316 00:08:25.379179 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:25 crc kubenswrapper[4816]: I0316 00:08:25.379264 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:25 crc kubenswrapper[4816]: I0316 00:08:25.379285 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:25 crc kubenswrapper[4816]: I0316 00:08:25.379317 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:25 crc kubenswrapper[4816]: I0316 00:08:25.379341 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:25Z","lastTransitionTime":"2026-03-16T00:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:25 crc kubenswrapper[4816]: I0316 00:08:25.482932 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:25 crc kubenswrapper[4816]: I0316 00:08:25.482992 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:25 crc kubenswrapper[4816]: I0316 00:08:25.483048 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:25 crc kubenswrapper[4816]: I0316 00:08:25.483075 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:25 crc kubenswrapper[4816]: I0316 00:08:25.483093 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:25Z","lastTransitionTime":"2026-03-16T00:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:25 crc kubenswrapper[4816]: I0316 00:08:25.586367 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:25 crc kubenswrapper[4816]: I0316 00:08:25.586423 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:25 crc kubenswrapper[4816]: I0316 00:08:25.586441 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:25 crc kubenswrapper[4816]: I0316 00:08:25.586466 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:25 crc kubenswrapper[4816]: I0316 00:08:25.586482 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:25Z","lastTransitionTime":"2026-03-16T00:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:25 crc kubenswrapper[4816]: I0316 00:08:25.688836 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:25 crc kubenswrapper[4816]: I0316 00:08:25.688872 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:25 crc kubenswrapper[4816]: I0316 00:08:25.688882 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:25 crc kubenswrapper[4816]: I0316 00:08:25.688899 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:25 crc kubenswrapper[4816]: I0316 00:08:25.688910 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:25Z","lastTransitionTime":"2026-03-16T00:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:25 crc kubenswrapper[4816]: I0316 00:08:25.791660 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:25 crc kubenswrapper[4816]: I0316 00:08:25.791720 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:25 crc kubenswrapper[4816]: I0316 00:08:25.791734 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:25 crc kubenswrapper[4816]: I0316 00:08:25.791754 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:25 crc kubenswrapper[4816]: I0316 00:08:25.792114 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:25Z","lastTransitionTime":"2026-03-16T00:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:25 crc kubenswrapper[4816]: I0316 00:08:25.894951 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:25 crc kubenswrapper[4816]: I0316 00:08:25.895043 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:25 crc kubenswrapper[4816]: I0316 00:08:25.895060 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:25 crc kubenswrapper[4816]: I0316 00:08:25.895081 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:25 crc kubenswrapper[4816]: I0316 00:08:25.895094 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:25Z","lastTransitionTime":"2026-03-16T00:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:25 crc kubenswrapper[4816]: I0316 00:08:25.998393 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:25 crc kubenswrapper[4816]: I0316 00:08:25.998466 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:25 crc kubenswrapper[4816]: I0316 00:08:25.998483 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:25 crc kubenswrapper[4816]: I0316 00:08:25.998508 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:25 crc kubenswrapper[4816]: I0316 00:08:25.998527 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:25Z","lastTransitionTime":"2026-03-16T00:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:26 crc kubenswrapper[4816]: I0316 00:08:26.101050 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:26 crc kubenswrapper[4816]: I0316 00:08:26.101116 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:26 crc kubenswrapper[4816]: I0316 00:08:26.101127 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:26 crc kubenswrapper[4816]: I0316 00:08:26.101145 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:26 crc kubenswrapper[4816]: I0316 00:08:26.101161 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:26Z","lastTransitionTime":"2026-03-16T00:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:26 crc kubenswrapper[4816]: I0316 00:08:26.203992 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:26 crc kubenswrapper[4816]: I0316 00:08:26.204036 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:26 crc kubenswrapper[4816]: I0316 00:08:26.204047 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:26 crc kubenswrapper[4816]: I0316 00:08:26.204068 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:26 crc kubenswrapper[4816]: I0316 00:08:26.204080 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:26Z","lastTransitionTime":"2026-03-16T00:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:26 crc kubenswrapper[4816]: I0316 00:08:26.307148 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:26 crc kubenswrapper[4816]: I0316 00:08:26.307207 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:26 crc kubenswrapper[4816]: I0316 00:08:26.307224 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:26 crc kubenswrapper[4816]: I0316 00:08:26.307253 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:26 crc kubenswrapper[4816]: I0316 00:08:26.307270 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:26Z","lastTransitionTime":"2026-03-16T00:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:26 crc kubenswrapper[4816]: I0316 00:08:26.410665 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:26 crc kubenswrapper[4816]: I0316 00:08:26.410717 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:26 crc kubenswrapper[4816]: I0316 00:08:26.410728 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:26 crc kubenswrapper[4816]: I0316 00:08:26.410745 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:26 crc kubenswrapper[4816]: I0316 00:08:26.410757 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:26Z","lastTransitionTime":"2026-03-16T00:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:26 crc kubenswrapper[4816]: I0316 00:08:26.515003 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:26 crc kubenswrapper[4816]: I0316 00:08:26.515058 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:26 crc kubenswrapper[4816]: I0316 00:08:26.515075 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:26 crc kubenswrapper[4816]: I0316 00:08:26.515096 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:26 crc kubenswrapper[4816]: I0316 00:08:26.515114 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:26Z","lastTransitionTime":"2026-03-16T00:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:26 crc kubenswrapper[4816]: I0316 00:08:26.618131 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:26 crc kubenswrapper[4816]: I0316 00:08:26.618185 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:26 crc kubenswrapper[4816]: I0316 00:08:26.618198 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:26 crc kubenswrapper[4816]: I0316 00:08:26.618219 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:26 crc kubenswrapper[4816]: I0316 00:08:26.618233 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:26Z","lastTransitionTime":"2026-03-16T00:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:26 crc kubenswrapper[4816]: I0316 00:08:26.667493 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:08:26 crc kubenswrapper[4816]: I0316 00:08:26.667614 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:08:26 crc kubenswrapper[4816]: I0316 00:08:26.667614 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:08:26 crc kubenswrapper[4816]: E0316 00:08:26.667782 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:08:26 crc kubenswrapper[4816]: E0316 00:08:26.667962 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:08:26 crc kubenswrapper[4816]: E0316 00:08:26.668173 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:08:26 crc kubenswrapper[4816]: I0316 00:08:26.721882 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:26 crc kubenswrapper[4816]: I0316 00:08:26.721962 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:26 crc kubenswrapper[4816]: I0316 00:08:26.721985 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:26 crc kubenswrapper[4816]: I0316 00:08:26.722018 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:26 crc kubenswrapper[4816]: I0316 00:08:26.722043 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:26Z","lastTransitionTime":"2026-03-16T00:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:26 crc kubenswrapper[4816]: I0316 00:08:26.825806 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:26 crc kubenswrapper[4816]: I0316 00:08:26.826204 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:26 crc kubenswrapper[4816]: I0316 00:08:26.826298 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:26 crc kubenswrapper[4816]: I0316 00:08:26.826442 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:26 crc kubenswrapper[4816]: I0316 00:08:26.826536 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:26Z","lastTransitionTime":"2026-03-16T00:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:26 crc kubenswrapper[4816]: I0316 00:08:26.930584 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:26 crc kubenswrapper[4816]: I0316 00:08:26.930653 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:26 crc kubenswrapper[4816]: I0316 00:08:26.930666 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:26 crc kubenswrapper[4816]: I0316 00:08:26.930694 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:26 crc kubenswrapper[4816]: I0316 00:08:26.930708 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:26Z","lastTransitionTime":"2026-03-16T00:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:27 crc kubenswrapper[4816]: I0316 00:08:27.034542 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:27 crc kubenswrapper[4816]: I0316 00:08:27.034616 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:27 crc kubenswrapper[4816]: I0316 00:08:27.034633 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:27 crc kubenswrapper[4816]: I0316 00:08:27.034659 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:27 crc kubenswrapper[4816]: I0316 00:08:27.034680 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:27Z","lastTransitionTime":"2026-03-16T00:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:27 crc kubenswrapper[4816]: I0316 00:08:27.093765 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"516852108b3d17a15676573ff080d3d94fa48d8076ed0404a8d827e715c2555e"} Mar 16 00:08:27 crc kubenswrapper[4816]: I0316 00:08:27.111959 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://723654bc91ce4d40a27b14a9c4df1ed6c8dff2517f378927d7a6a96aeaa6951b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:27Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:27 crc kubenswrapper[4816]: I0316 00:08:27.132671 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:27Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:27 crc kubenswrapper[4816]: I0316 00:08:27.138119 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:27 crc kubenswrapper[4816]: I0316 00:08:27.138204 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:27 crc kubenswrapper[4816]: I0316 00:08:27.138224 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:27 crc 
kubenswrapper[4816]: I0316 00:08:27.138289 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:27 crc kubenswrapper[4816]: I0316 00:08:27.138307 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:27Z","lastTransitionTime":"2026-03-16T00:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:27 crc kubenswrapper[4816]: I0316 00:08:27.150406 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:27Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:27 crc kubenswrapper[4816]: I0316 00:08:27.164657 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37869baff974556c22aadadfa4b528b5d6b9ad236107b4dbc945b3cdbf743f20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f9fe727708fdaca68596b7d305629ade45b061a88a81085f885d490ab99e24b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:27Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:27 crc kubenswrapper[4816]: I0316 00:08:27.179901 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://516852108b3d17a15676573ff080d3d94fa48d8076ed0404a8d827e715c2555e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-16T00:08:27Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:27 crc kubenswrapper[4816]: I0316 00:08:27.194117 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:27Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:27 crc kubenswrapper[4816]: I0316 00:08:27.219382 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"49e35ef6-7a6d-438f-b598-4445d73db379\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbe90b49c5d42bb9d6d5d1b8f1d02b571403dd3f39d3118081351d72c4910914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://508e6dbe6f2f1392d16501e09ee2ec523a3c2a245cd96c6ab773e26e83b8bb6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b5a3223ff0fb5b96e5b87ee836add66498165759f4d6ed8cb6413f091967e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://603bdcbd4a6af5fda7ca09e4823e0db8d7ec3e3eb38eddf9f062a14b433cb094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b061f16a65a27b9a5ef66231376b70fec084479a991b7fdd430957df76da32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:27Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:27 crc kubenswrapper[4816]: I0316 00:08:27.236858 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c8786d1-025d-4788-bafe-c2c2eaf8e398\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46db23088e8ba791d736f70a65bd066af80a5ec27c31332d9ac9c52530b21bfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da7b257af12b9ee605dc32a26d9565f866a9cafebb4936a0bde7f42ae9909f80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c2523ea43f2490afb81354be8a2840f54a3be1a8d3154196e0c8ac365d09723a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0316 00:07:59.373922 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0316 00:07:59.374075 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0316 00:07:59.374860 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2696044728/tls.crt::/tmp/serving-cert-2696044728/tls.key\\\\\\\"\\\\nI0316 00:07:59.560928 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0316 00:07:59.563996 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0316 00:07:59.564013 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0316 00:07:59.564035 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0316 00:07:59.564041 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0316 00:07:59.571475 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0316 00:07:59.571523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571531 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571540 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0316 00:07:59.571574 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0316 00:07:59.571588 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0316 00:07:59.571493 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0316 00:07:59.571597 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0316 00:07:59.573484 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81d0c40be9c3912864280cd98481f837ef29f69d85599b39decf5e9d7a97eff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:27Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:27 crc kubenswrapper[4816]: I0316 00:08:27.241082 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:27 crc kubenswrapper[4816]: I0316 00:08:27.241141 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:27 crc kubenswrapper[4816]: I0316 00:08:27.241156 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:27 crc kubenswrapper[4816]: I0316 00:08:27.241180 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:27 crc kubenswrapper[4816]: I0316 00:08:27.241199 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:27Z","lastTransitionTime":"2026-03-16T00:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:27 crc kubenswrapper[4816]: I0316 00:08:27.344893 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:27 crc kubenswrapper[4816]: I0316 00:08:27.344976 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:27 crc kubenswrapper[4816]: I0316 00:08:27.344991 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:27 crc kubenswrapper[4816]: I0316 00:08:27.345018 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:27 crc kubenswrapper[4816]: I0316 00:08:27.345034 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:27Z","lastTransitionTime":"2026-03-16T00:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:27 crc kubenswrapper[4816]: I0316 00:08:27.448582 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:27 crc kubenswrapper[4816]: I0316 00:08:27.448638 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:27 crc kubenswrapper[4816]: I0316 00:08:27.448653 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:27 crc kubenswrapper[4816]: I0316 00:08:27.448675 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:27 crc kubenswrapper[4816]: I0316 00:08:27.448688 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:27Z","lastTransitionTime":"2026-03-16T00:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:27 crc kubenswrapper[4816]: I0316 00:08:27.553614 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:27 crc kubenswrapper[4816]: I0316 00:08:27.553667 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:27 crc kubenswrapper[4816]: I0316 00:08:27.553685 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:27 crc kubenswrapper[4816]: I0316 00:08:27.553715 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:27 crc kubenswrapper[4816]: I0316 00:08:27.553734 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:27Z","lastTransitionTime":"2026-03-16T00:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:27 crc kubenswrapper[4816]: I0316 00:08:27.656678 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:27 crc kubenswrapper[4816]: I0316 00:08:27.656748 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:27 crc kubenswrapper[4816]: I0316 00:08:27.656765 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:27 crc kubenswrapper[4816]: I0316 00:08:27.656794 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:27 crc kubenswrapper[4816]: I0316 00:08:27.656811 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:27Z","lastTransitionTime":"2026-03-16T00:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:27 crc kubenswrapper[4816]: I0316 00:08:27.693732 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://723654bc91ce4d40a27b14a9c4df1ed6c8dff2517f378927d7a6a96aeaa6951b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:27Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:27 crc kubenswrapper[4816]: I0316 00:08:27.716456 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:27Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:27 crc kubenswrapper[4816]: I0316 00:08:27.738613 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:27Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:27 crc kubenswrapper[4816]: I0316 00:08:27.759801 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:27 crc kubenswrapper[4816]: I0316 00:08:27.759865 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:27 crc kubenswrapper[4816]: I0316 00:08:27.759875 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:27 crc kubenswrapper[4816]: I0316 
00:08:27.759895 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:27 crc kubenswrapper[4816]: I0316 00:08:27.759907 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:27Z","lastTransitionTime":"2026-03-16T00:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:27 crc kubenswrapper[4816]: I0316 00:08:27.761863 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37869baff974556c22aadadfa4b528b5d6b9ad236107b4dbc945b3cdbf743f20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,
\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f9fe727708fdaca68596b7d305629ade45b061a88a81085f885d490ab99e24b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:27Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:27 crc kubenswrapper[4816]: I0316 00:08:27.801421 4816 
status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49e35ef6-7a6d-438f-b598-4445d73db379\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbe90b49c5d42bb9d6d5d1b8f1d02b571403dd3f39d3118081351d72c4910914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://508e6dbe6f2f
1392d16501e09ee2ec523a3c2a245cd96c6ab773e26e83b8bb6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b5a3223ff0fb5b96e5b87ee836add66498165759f4d6ed8cb6413f091967e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://603bdcbd4a6af5fda7ca09e4823e0db8d7ec3e3eb38eddf9f062a14b433cb094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\
":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b061f16a65a27b9a5ef66231376b70fec084479a991b7fdd430957df76da32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":
\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resourc
es\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:27Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:27 crc kubenswrapper[4816]: I0316 00:08:27.826244 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c8786d1-025d-4788-bafe-c2c2eaf8e398\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46db23088e8ba791d736f70a65bd066af80a5ec27c31332d9ac9c52530b21bfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da7b257af12b9ee605dc32a26d9565f866a9cafebb4936a0bde7f42ae9909f80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c2523ea43f2490afb81354be8a2840f54a3be1a8d3154196e0c8ac365d09723a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0316 00:07:59.373922 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0316 00:07:59.374075 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0316 00:07:59.374860 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2696044728/tls.crt::/tmp/serving-cert-2696044728/tls.key\\\\\\\"\\\\nI0316 00:07:59.560928 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0316 00:07:59.563996 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0316 00:07:59.564013 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0316 00:07:59.564035 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0316 00:07:59.564041 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0316 00:07:59.571475 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0316 00:07:59.571523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571531 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571540 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0316 00:07:59.571574 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0316 00:07:59.571588 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0316 00:07:59.571493 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0316 00:07:59.571597 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0316 00:07:59.573484 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81d0c40be9c3912864280cd98481f837ef29f69d85599b39decf5e9d7a97eff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:27Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:27 crc kubenswrapper[4816]: I0316 00:08:27.846564 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://516852108b3d17a15676573ff080d3d94fa48d8076ed0404a8d827e715c2555e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:27Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:27 crc kubenswrapper[4816]: I0316 00:08:27.862703 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:27 crc kubenswrapper[4816]: I0316 00:08:27.862829 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:27 crc kubenswrapper[4816]: I0316 00:08:27.862959 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:27 crc kubenswrapper[4816]: I0316 00:08:27.862999 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:27 crc kubenswrapper[4816]: I0316 00:08:27.863019 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:27Z","lastTransitionTime":"2026-03-16T00:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:27 crc kubenswrapper[4816]: I0316 00:08:27.869728 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:27Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:27 crc kubenswrapper[4816]: I0316 00:08:27.966736 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:27 crc kubenswrapper[4816]: I0316 00:08:27.966817 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:27 crc kubenswrapper[4816]: I0316 00:08:27.966834 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:27 crc kubenswrapper[4816]: I0316 00:08:27.966865 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:27 crc kubenswrapper[4816]: I0316 00:08:27.966885 4816 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:27Z","lastTransitionTime":"2026-03-16T00:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:28 crc kubenswrapper[4816]: I0316 00:08:28.070448 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:28 crc kubenswrapper[4816]: I0316 00:08:28.070514 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:28 crc kubenswrapper[4816]: I0316 00:08:28.070537 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:28 crc kubenswrapper[4816]: I0316 00:08:28.070613 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:28 crc kubenswrapper[4816]: I0316 00:08:28.070643 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:28Z","lastTransitionTime":"2026-03-16T00:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:28 crc kubenswrapper[4816]: I0316 00:08:28.173708 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:28 crc kubenswrapper[4816]: I0316 00:08:28.173834 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:28 crc kubenswrapper[4816]: I0316 00:08:28.173867 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:28 crc kubenswrapper[4816]: I0316 00:08:28.173902 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:28 crc kubenswrapper[4816]: I0316 00:08:28.173926 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:28Z","lastTransitionTime":"2026-03-16T00:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:28 crc kubenswrapper[4816]: I0316 00:08:28.276895 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:28 crc kubenswrapper[4816]: I0316 00:08:28.276985 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:28 crc kubenswrapper[4816]: I0316 00:08:28.277013 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:28 crc kubenswrapper[4816]: I0316 00:08:28.277049 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:28 crc kubenswrapper[4816]: I0316 00:08:28.277076 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:28Z","lastTransitionTime":"2026-03-16T00:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:28 crc kubenswrapper[4816]: I0316 00:08:28.380727 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:28 crc kubenswrapper[4816]: I0316 00:08:28.380793 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:28 crc kubenswrapper[4816]: I0316 00:08:28.380811 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:28 crc kubenswrapper[4816]: I0316 00:08:28.380836 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:28 crc kubenswrapper[4816]: I0316 00:08:28.380854 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:28Z","lastTransitionTime":"2026-03-16T00:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:28 crc kubenswrapper[4816]: I0316 00:08:28.483450 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:28 crc kubenswrapper[4816]: I0316 00:08:28.483507 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:28 crc kubenswrapper[4816]: I0316 00:08:28.483525 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:28 crc kubenswrapper[4816]: I0316 00:08:28.483577 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:28 crc kubenswrapper[4816]: I0316 00:08:28.483597 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:28Z","lastTransitionTime":"2026-03-16T00:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:28 crc kubenswrapper[4816]: I0316 00:08:28.586775 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:28 crc kubenswrapper[4816]: I0316 00:08:28.586825 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:28 crc kubenswrapper[4816]: I0316 00:08:28.586842 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:28 crc kubenswrapper[4816]: I0316 00:08:28.586864 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:28 crc kubenswrapper[4816]: I0316 00:08:28.586881 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:28Z","lastTransitionTime":"2026-03-16T00:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:28 crc kubenswrapper[4816]: I0316 00:08:28.634776 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:28 crc kubenswrapper[4816]: I0316 00:08:28.634836 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:28 crc kubenswrapper[4816]: I0316 00:08:28.634854 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:28 crc kubenswrapper[4816]: I0316 00:08:28.634875 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:28 crc kubenswrapper[4816]: I0316 00:08:28.634890 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:28Z","lastTransitionTime":"2026-03-16T00:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:28 crc kubenswrapper[4816]: E0316 00:08:28.656445 4816 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d8fb348c-4907-4c8f-859a-735976530e03\\\",\\\"systemUUID\\\":\\\"e97abf98-298f-4589-a70a-4cfb5cb2994a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:28Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:28 crc kubenswrapper[4816]: I0316 00:08:28.661913 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:28 crc kubenswrapper[4816]: I0316 00:08:28.661990 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:28 crc kubenswrapper[4816]: I0316 00:08:28.662016 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:28 crc kubenswrapper[4816]: I0316 00:08:28.662052 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:28 crc kubenswrapper[4816]: I0316 00:08:28.662075 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:28Z","lastTransitionTime":"2026-03-16T00:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:28 crc kubenswrapper[4816]: I0316 00:08:28.667507 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:08:28 crc kubenswrapper[4816]: I0316 00:08:28.667602 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:08:28 crc kubenswrapper[4816]: I0316 00:08:28.667578 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:08:28 crc kubenswrapper[4816]: E0316 00:08:28.667792 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:08:28 crc kubenswrapper[4816]: E0316 00:08:28.667885 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:08:28 crc kubenswrapper[4816]: E0316 00:08:28.668102 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:08:28 crc kubenswrapper[4816]: E0316 00:08:28.694771 4816 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d8fb348c-4907-4c8f-859a-735976530e03\\\",\\\"systemUUID\\\":\\\"e97abf98-298f-4589-a70a-4cfb5cb2994a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:28Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:28 crc kubenswrapper[4816]: I0316 00:08:28.700068 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:28 crc kubenswrapper[4816]: I0316 00:08:28.700148 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:28 crc kubenswrapper[4816]: I0316 00:08:28.700161 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:28 crc kubenswrapper[4816]: I0316 00:08:28.700188 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:28 crc kubenswrapper[4816]: I0316 00:08:28.700202 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:28Z","lastTransitionTime":"2026-03-16T00:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:28 crc kubenswrapper[4816]: E0316 00:08:28.722148 4816 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d8fb348c-4907-4c8f-859a-735976530e03\\\",\\\"systemUUID\\\":\\\"e97abf98-298f-4589-a70a-4cfb5cb2994a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:28Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:28 crc kubenswrapper[4816]: I0316 00:08:28.727698 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:28 crc kubenswrapper[4816]: I0316 00:08:28.727755 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:28 crc kubenswrapper[4816]: I0316 00:08:28.727774 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:28 crc kubenswrapper[4816]: I0316 00:08:28.727800 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:28 crc kubenswrapper[4816]: I0316 00:08:28.727818 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:28Z","lastTransitionTime":"2026-03-16T00:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:28 crc kubenswrapper[4816]: E0316 00:08:28.752034 4816 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d8fb348c-4907-4c8f-859a-735976530e03\\\",\\\"systemUUID\\\":\\\"e97abf98-298f-4589-a70a-4cfb5cb2994a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:28Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:28 crc kubenswrapper[4816]: I0316 00:08:28.757608 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:28 crc kubenswrapper[4816]: I0316 00:08:28.757672 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:28 crc kubenswrapper[4816]: I0316 00:08:28.757690 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:28 crc kubenswrapper[4816]: I0316 00:08:28.757714 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:28 crc kubenswrapper[4816]: I0316 00:08:28.757732 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:28Z","lastTransitionTime":"2026-03-16T00:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:28 crc kubenswrapper[4816]: E0316 00:08:28.778512 4816 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d8fb348c-4907-4c8f-859a-735976530e03\\\",\\\"systemUUID\\\":\\\"e97abf98-298f-4589-a70a-4cfb5cb2994a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:28Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:28 crc kubenswrapper[4816]: E0316 00:08:28.778832 4816 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 16 00:08:28 crc kubenswrapper[4816]: I0316 00:08:28.781203 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:28 crc kubenswrapper[4816]: I0316 00:08:28.781280 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:28 crc kubenswrapper[4816]: I0316 00:08:28.781296 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:28 crc kubenswrapper[4816]: I0316 00:08:28.781322 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:28 crc kubenswrapper[4816]: I0316 00:08:28.781339 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:28Z","lastTransitionTime":"2026-03-16T00:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:28 crc kubenswrapper[4816]: I0316 00:08:28.885169 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:28 crc kubenswrapper[4816]: I0316 00:08:28.885237 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:28 crc kubenswrapper[4816]: I0316 00:08:28.885254 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:28 crc kubenswrapper[4816]: I0316 00:08:28.885278 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:28 crc kubenswrapper[4816]: I0316 00:08:28.885295 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:28Z","lastTransitionTime":"2026-03-16T00:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:28 crc kubenswrapper[4816]: I0316 00:08:28.989101 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:28 crc kubenswrapper[4816]: I0316 00:08:28.989171 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:28 crc kubenswrapper[4816]: I0316 00:08:28.989182 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:28 crc kubenswrapper[4816]: I0316 00:08:28.989203 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:28 crc kubenswrapper[4816]: I0316 00:08:28.989215 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:28Z","lastTransitionTime":"2026-03-16T00:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:29 crc kubenswrapper[4816]: I0316 00:08:29.093042 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:29 crc kubenswrapper[4816]: I0316 00:08:29.093115 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:29 crc kubenswrapper[4816]: I0316 00:08:29.093136 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:29 crc kubenswrapper[4816]: I0316 00:08:29.093165 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:29 crc kubenswrapper[4816]: I0316 00:08:29.093186 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:29Z","lastTransitionTime":"2026-03-16T00:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:29 crc kubenswrapper[4816]: I0316 00:08:29.195716 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:29 crc kubenswrapper[4816]: I0316 00:08:29.195821 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:29 crc kubenswrapper[4816]: I0316 00:08:29.195841 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:29 crc kubenswrapper[4816]: I0316 00:08:29.195867 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:29 crc kubenswrapper[4816]: I0316 00:08:29.195886 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:29Z","lastTransitionTime":"2026-03-16T00:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:29 crc kubenswrapper[4816]: I0316 00:08:29.298792 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:29 crc kubenswrapper[4816]: I0316 00:08:29.298852 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:29 crc kubenswrapper[4816]: I0316 00:08:29.298872 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:29 crc kubenswrapper[4816]: I0316 00:08:29.298896 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:29 crc kubenswrapper[4816]: I0316 00:08:29.298913 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:29Z","lastTransitionTime":"2026-03-16T00:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:29 crc kubenswrapper[4816]: I0316 00:08:29.401586 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:29 crc kubenswrapper[4816]: I0316 00:08:29.401629 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:29 crc kubenswrapper[4816]: I0316 00:08:29.401638 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:29 crc kubenswrapper[4816]: I0316 00:08:29.401655 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:29 crc kubenswrapper[4816]: I0316 00:08:29.401664 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:29Z","lastTransitionTime":"2026-03-16T00:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:29 crc kubenswrapper[4816]: I0316 00:08:29.504142 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:29 crc kubenswrapper[4816]: I0316 00:08:29.504187 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:29 crc kubenswrapper[4816]: I0316 00:08:29.504202 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:29 crc kubenswrapper[4816]: I0316 00:08:29.504224 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:29 crc kubenswrapper[4816]: I0316 00:08:29.504238 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:29Z","lastTransitionTime":"2026-03-16T00:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:29 crc kubenswrapper[4816]: I0316 00:08:29.607737 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:29 crc kubenswrapper[4816]: I0316 00:08:29.607818 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:29 crc kubenswrapper[4816]: I0316 00:08:29.607848 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:29 crc kubenswrapper[4816]: I0316 00:08:29.607881 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:29 crc kubenswrapper[4816]: I0316 00:08:29.607907 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:29Z","lastTransitionTime":"2026-03-16T00:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:29 crc kubenswrapper[4816]: I0316 00:08:29.711228 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:29 crc kubenswrapper[4816]: I0316 00:08:29.711280 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:29 crc kubenswrapper[4816]: I0316 00:08:29.711297 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:29 crc kubenswrapper[4816]: I0316 00:08:29.711321 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:29 crc kubenswrapper[4816]: I0316 00:08:29.711340 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:29Z","lastTransitionTime":"2026-03-16T00:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:29 crc kubenswrapper[4816]: I0316 00:08:29.814639 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:29 crc kubenswrapper[4816]: I0316 00:08:29.814708 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:29 crc kubenswrapper[4816]: I0316 00:08:29.814734 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:29 crc kubenswrapper[4816]: I0316 00:08:29.814764 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:29 crc kubenswrapper[4816]: I0316 00:08:29.814786 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:29Z","lastTransitionTime":"2026-03-16T00:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:29 crc kubenswrapper[4816]: I0316 00:08:29.917384 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:29 crc kubenswrapper[4816]: I0316 00:08:29.917454 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:29 crc kubenswrapper[4816]: I0316 00:08:29.917477 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:29 crc kubenswrapper[4816]: I0316 00:08:29.917508 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:29 crc kubenswrapper[4816]: I0316 00:08:29.917532 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:29Z","lastTransitionTime":"2026-03-16T00:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:30 crc kubenswrapper[4816]: I0316 00:08:30.020374 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:30 crc kubenswrapper[4816]: I0316 00:08:30.020783 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:30 crc kubenswrapper[4816]: I0316 00:08:30.020918 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:30 crc kubenswrapper[4816]: I0316 00:08:30.021055 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:30 crc kubenswrapper[4816]: I0316 00:08:30.021180 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:30Z","lastTransitionTime":"2026-03-16T00:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:30 crc kubenswrapper[4816]: I0316 00:08:30.124926 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:30 crc kubenswrapper[4816]: I0316 00:08:30.124966 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:30 crc kubenswrapper[4816]: I0316 00:08:30.124979 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:30 crc kubenswrapper[4816]: I0316 00:08:30.125026 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:30 crc kubenswrapper[4816]: I0316 00:08:30.125042 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:30Z","lastTransitionTime":"2026-03-16T00:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:30 crc kubenswrapper[4816]: I0316 00:08:30.228071 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:30 crc kubenswrapper[4816]: I0316 00:08:30.228153 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:30 crc kubenswrapper[4816]: I0316 00:08:30.228174 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:30 crc kubenswrapper[4816]: I0316 00:08:30.228203 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:30 crc kubenswrapper[4816]: I0316 00:08:30.228221 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:30Z","lastTransitionTime":"2026-03-16T00:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:30 crc kubenswrapper[4816]: I0316 00:08:30.331198 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:30 crc kubenswrapper[4816]: I0316 00:08:30.331263 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:30 crc kubenswrapper[4816]: I0316 00:08:30.331282 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:30 crc kubenswrapper[4816]: I0316 00:08:30.331309 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:30 crc kubenswrapper[4816]: I0316 00:08:30.331328 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:30Z","lastTransitionTime":"2026-03-16T00:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:30 crc kubenswrapper[4816]: I0316 00:08:30.434801 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:30 crc kubenswrapper[4816]: I0316 00:08:30.434916 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:30 crc kubenswrapper[4816]: I0316 00:08:30.434939 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:30 crc kubenswrapper[4816]: I0316 00:08:30.434968 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:30 crc kubenswrapper[4816]: I0316 00:08:30.434986 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:30Z","lastTransitionTime":"2026-03-16T00:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:30 crc kubenswrapper[4816]: I0316 00:08:30.538309 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:30 crc kubenswrapper[4816]: I0316 00:08:30.538383 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:30 crc kubenswrapper[4816]: I0316 00:08:30.538401 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:30 crc kubenswrapper[4816]: I0316 00:08:30.538427 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:30 crc kubenswrapper[4816]: I0316 00:08:30.538446 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:30Z","lastTransitionTime":"2026-03-16T00:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:30 crc kubenswrapper[4816]: I0316 00:08:30.641782 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:30 crc kubenswrapper[4816]: I0316 00:08:30.641859 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:30 crc kubenswrapper[4816]: I0316 00:08:30.641884 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:30 crc kubenswrapper[4816]: I0316 00:08:30.641915 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:30 crc kubenswrapper[4816]: I0316 00:08:30.641938 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:30Z","lastTransitionTime":"2026-03-16T00:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:30 crc kubenswrapper[4816]: I0316 00:08:30.666713 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:08:30 crc kubenswrapper[4816]: I0316 00:08:30.666757 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:08:30 crc kubenswrapper[4816]: I0316 00:08:30.666757 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:08:30 crc kubenswrapper[4816]: E0316 00:08:30.667402 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:08:30 crc kubenswrapper[4816]: E0316 00:08:30.667611 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:08:30 crc kubenswrapper[4816]: E0316 00:08:30.667895 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:08:30 crc kubenswrapper[4816]: I0316 00:08:30.744951 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:30 crc kubenswrapper[4816]: I0316 00:08:30.745014 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:30 crc kubenswrapper[4816]: I0316 00:08:30.745032 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:30 crc kubenswrapper[4816]: I0316 00:08:30.745057 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:30 crc kubenswrapper[4816]: I0316 00:08:30.745079 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:30Z","lastTransitionTime":"2026-03-16T00:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:30 crc kubenswrapper[4816]: I0316 00:08:30.848506 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:30 crc kubenswrapper[4816]: I0316 00:08:30.848584 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:30 crc kubenswrapper[4816]: I0316 00:08:30.848597 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:30 crc kubenswrapper[4816]: I0316 00:08:30.848622 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:30 crc kubenswrapper[4816]: I0316 00:08:30.848639 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:30Z","lastTransitionTime":"2026-03-16T00:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:30 crc kubenswrapper[4816]: I0316 00:08:30.951911 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:30 crc kubenswrapper[4816]: I0316 00:08:30.951962 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:30 crc kubenswrapper[4816]: I0316 00:08:30.951978 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:30 crc kubenswrapper[4816]: I0316 00:08:30.952003 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:30 crc kubenswrapper[4816]: I0316 00:08:30.952023 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:30Z","lastTransitionTime":"2026-03-16T00:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.055605 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.055675 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.055693 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.055718 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.055736 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:31Z","lastTransitionTime":"2026-03-16T00:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.080011 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-cnhkf"] Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.080490 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-cnhkf" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.083131 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.083192 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.083355 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.106167 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37869baff974556c22aadadfa4b528b5d6b9ad236107b4dbc945b3cdbf743f20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f9fe727708fdaca68596b7d305629ade45b061a88a81085f885d490ab99e24b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:31Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.127612 4816 status_manager.go:875] "Failed to update status for 
pod" pod="openshift-dns/node-resolver-cnhkf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e686cd4-bddf-463e-b471-e49ea862691e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bwzt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cnhkf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:31Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.146446 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://723654bc91ce4d40a27b14a9c4df1ed6c8dff2517f378927d7a6a96aeaa6951b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:31Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.158812 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.158876 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.158900 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.158930 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.158951 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:31Z","lastTransitionTime":"2026-03-16T00:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.164953 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:31Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.179380 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/3e686cd4-bddf-463e-b471-e49ea862691e-hosts-file\") pod \"node-resolver-cnhkf\" (UID: \"3e686cd4-bddf-463e-b471-e49ea862691e\") " pod="openshift-dns/node-resolver-cnhkf" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.179479 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9bwzt\" (UniqueName: \"kubernetes.io/projected/3e686cd4-bddf-463e-b471-e49ea862691e-kube-api-access-9bwzt\") pod \"node-resolver-cnhkf\" (UID: \"3e686cd4-bddf-463e-b471-e49ea862691e\") " pod="openshift-dns/node-resolver-cnhkf" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.192293 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:31Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.220300 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"49e35ef6-7a6d-438f-b598-4445d73db379\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbe90b49c5d42bb9d6d5d1b8f1d02b571403dd3f39d3118081351d72c4910914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://508e6dbe6f2f1392d16501e09ee2ec523a3c2a245cd96c6ab773e26e83b8bb6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b5a3223ff0fb5b96e5b87ee836add66498165759f4d6ed8cb6413f091967e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://603bdcbd4a6af5fda7ca09e4823e0db8d7ec3e3eb38eddf9f062a14b433cb094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b061f16a65a27b9a5ef66231376b70fec084479a991b7fdd430957df76da32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:31Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.236453 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c8786d1-025d-4788-bafe-c2c2eaf8e398\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46db23088e8ba791d736f70a65bd066af80a5ec27c31332d9ac9c52530b21bfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da7b257af12b9ee605dc32a26d9565f866a9cafebb4936a0bde7f42ae9909f80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c2523ea43f2490afb81354be8a2840f54a3be1a8d3154196e0c8ac365d09723a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0316 00:07:59.373922 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0316 00:07:59.374075 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0316 00:07:59.374860 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2696044728/tls.crt::/tmp/serving-cert-2696044728/tls.key\\\\\\\"\\\\nI0316 00:07:59.560928 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0316 00:07:59.563996 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0316 00:07:59.564013 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0316 00:07:59.564035 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0316 00:07:59.564041 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0316 00:07:59.571475 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0316 00:07:59.571523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571531 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571540 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0316 00:07:59.571574 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0316 00:07:59.571588 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0316 00:07:59.571493 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0316 00:07:59.571597 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0316 00:07:59.573484 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81d0c40be9c3912864280cd98481f837ef29f69d85599b39decf5e9d7a97eff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:31Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.251117 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://516852108b3d17a15676573ff080d3d94fa48d8076ed0404a8d827e715c2555e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:31Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.261749 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.261817 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.261840 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.261864 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.261880 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:31Z","lastTransitionTime":"2026-03-16T00:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.267508 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:31Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.280977 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/3e686cd4-bddf-463e-b471-e49ea862691e-hosts-file\") pod \"node-resolver-cnhkf\" (UID: \"3e686cd4-bddf-463e-b471-e49ea862691e\") " pod="openshift-dns/node-resolver-cnhkf" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.281216 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9bwzt\" (UniqueName: \"kubernetes.io/projected/3e686cd4-bddf-463e-b471-e49ea862691e-kube-api-access-9bwzt\") pod \"node-resolver-cnhkf\" (UID: \"3e686cd4-bddf-463e-b471-e49ea862691e\") " pod="openshift-dns/node-resolver-cnhkf" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.281219 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: 
\"kubernetes.io/host-path/3e686cd4-bddf-463e-b471-e49ea862691e-hosts-file\") pod \"node-resolver-cnhkf\" (UID: \"3e686cd4-bddf-463e-b471-e49ea862691e\") " pod="openshift-dns/node-resolver-cnhkf" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.310246 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9bwzt\" (UniqueName: \"kubernetes.io/projected/3e686cd4-bddf-463e-b471-e49ea862691e-kube-api-access-9bwzt\") pod \"node-resolver-cnhkf\" (UID: \"3e686cd4-bddf-463e-b471-e49ea862691e\") " pod="openshift-dns/node-resolver-cnhkf" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.364890 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.364936 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.364950 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.364970 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.364984 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:31Z","lastTransitionTime":"2026-03-16T00:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.402404 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-cnhkf" Mar 16 00:08:31 crc kubenswrapper[4816]: W0316 00:08:31.416987 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3e686cd4_bddf_463e_b471_e49ea862691e.slice/crio-ee4ee15b9147a70ac3d21b58b9d3cb23b0646c2fdf284bbf82e86743ee26a221 WatchSource:0}: Error finding container ee4ee15b9147a70ac3d21b58b9d3cb23b0646c2fdf284bbf82e86743ee26a221: Status 404 returned error can't find the container with id ee4ee15b9147a70ac3d21b58b9d3cb23b0646c2fdf284bbf82e86743ee26a221 Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.467532 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-mt7bq"] Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.468358 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-jrdcz"] Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.468536 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-szscw"] Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.468799 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-szscw" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.469187 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-mt7bq" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.469861 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.470587 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.470627 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.470641 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.470659 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.470672 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:31Z","lastTransitionTime":"2026-03-16T00:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.474472 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.474925 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.475120 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.475117 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.475304 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.475238 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.475583 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.475643 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.475545 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.475796 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.508156 4816 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.508482 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.543207 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://723654bc91ce4d40a27b14a9c4df1ed6c8dff2517f378927d7a6a96aeaa6951b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mount
Path\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:31Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.567215 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37869baff974556c22aadadfa4b528b5d6b9ad236107b4dbc945b3cdbf743f20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running
\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f9fe727708fdaca68596b7d305629ade45b061a88a81085f885d490ab99e24b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:31Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.574031 4816 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.574077 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.574089 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.574110 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.574123 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:31Z","lastTransitionTime":"2026-03-16T00:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.578785 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cnhkf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e686cd4-bddf-463e-b471-e49ea862691e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bwzt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cnhkf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:31Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.592454 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e9789e58-12c8-4831-9401-af48a3e92209-host-var-lib-cni-bin\") pod \"multus-szscw\" (UID: \"e9789e58-12c8-4831-9401-af48a3e92209\") " pod="openshift-multus/multus-szscw" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.592508 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/03ef49f1-0c6a-443a-8df3-2db339c562ed-cnibin\") pod \"multus-additional-cni-plugins-mt7bq\" (UID: \"03ef49f1-0c6a-443a-8df3-2db339c562ed\") " pod="openshift-multus/multus-additional-cni-plugins-mt7bq" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.592545 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/03ef49f1-0c6a-443a-8df3-2db339c562ed-tuning-conf-dir\") pod \"multus-additional-cni-plugins-mt7bq\" (UID: \"03ef49f1-0c6a-443a-8df3-2db339c562ed\") " pod="openshift-multus/multus-additional-cni-plugins-mt7bq" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.592609 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e9789e58-12c8-4831-9401-af48a3e92209-host-run-netns\") pod \"multus-szscw\" (UID: \"e9789e58-12c8-4831-9401-af48a3e92209\") " pod="openshift-multus/multus-szscw" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.592643 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trwbh\" (UniqueName: \"kubernetes.io/projected/dd08ece2-7636-4966-973a-e96a34b70b53-kube-api-access-trwbh\") pod \"machine-config-daemon-jrdcz\" (UID: \"dd08ece2-7636-4966-973a-e96a34b70b53\") " pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.592743 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/e9789e58-12c8-4831-9401-af48a3e92209-multus-socket-dir-parent\") pod \"multus-szscw\" (UID: \"e9789e58-12c8-4831-9401-af48a3e92209\") " pod="openshift-multus/multus-szscw" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.592834 4816 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/e9789e58-12c8-4831-9401-af48a3e92209-host-var-lib-cni-multus\") pod \"multus-szscw\" (UID: \"e9789e58-12c8-4831-9401-af48a3e92209\") " pod="openshift-multus/multus-szscw" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.592863 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e9789e58-12c8-4831-9401-af48a3e92209-host-var-lib-kubelet\") pod \"multus-szscw\" (UID: \"e9789e58-12c8-4831-9401-af48a3e92209\") " pod="openshift-multus/multus-szscw" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.592890 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e9789e58-12c8-4831-9401-af48a3e92209-etc-kubernetes\") pod \"multus-szscw\" (UID: \"e9789e58-12c8-4831-9401-af48a3e92209\") " pod="openshift-multus/multus-szscw" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.592924 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/dd08ece2-7636-4966-973a-e96a34b70b53-mcd-auth-proxy-config\") pod \"machine-config-daemon-jrdcz\" (UID: \"dd08ece2-7636-4966-973a-e96a34b70b53\") " pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.593009 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xfvdb\" (UniqueName: \"kubernetes.io/projected/03ef49f1-0c6a-443a-8df3-2db339c562ed-kube-api-access-xfvdb\") pod \"multus-additional-cni-plugins-mt7bq\" (UID: \"03ef49f1-0c6a-443a-8df3-2db339c562ed\") " pod="openshift-multus/multus-additional-cni-plugins-mt7bq" Mar 16 00:08:31 crc 
kubenswrapper[4816]: I0316 00:08:31.593112 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e9789e58-12c8-4831-9401-af48a3e92209-multus-conf-dir\") pod \"multus-szscw\" (UID: \"e9789e58-12c8-4831-9401-af48a3e92209\") " pod="openshift-multus/multus-szscw" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.593156 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/e9789e58-12c8-4831-9401-af48a3e92209-multus-daemon-config\") pod \"multus-szscw\" (UID: \"e9789e58-12c8-4831-9401-af48a3e92209\") " pod="openshift-multus/multus-szscw" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.593197 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/03ef49f1-0c6a-443a-8df3-2db339c562ed-os-release\") pod \"multus-additional-cni-plugins-mt7bq\" (UID: \"03ef49f1-0c6a-443a-8df3-2db339c562ed\") " pod="openshift-multus/multus-additional-cni-plugins-mt7bq" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.593279 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/dd08ece2-7636-4966-973a-e96a34b70b53-rootfs\") pod \"machine-config-daemon-jrdcz\" (UID: \"dd08ece2-7636-4966-973a-e96a34b70b53\") " pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.593295 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/e9789e58-12c8-4831-9401-af48a3e92209-hostroot\") pod \"multus-szscw\" (UID: \"e9789e58-12c8-4831-9401-af48a3e92209\") " pod="openshift-multus/multus-szscw" Mar 16 00:08:31 crc 
kubenswrapper[4816]: I0316 00:08:31.593311 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/03ef49f1-0c6a-443a-8df3-2db339c562ed-cni-binary-copy\") pod \"multus-additional-cni-plugins-mt7bq\" (UID: \"03ef49f1-0c6a-443a-8df3-2db339c562ed\") " pod="openshift-multus/multus-additional-cni-plugins-mt7bq" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.593360 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/03ef49f1-0c6a-443a-8df3-2db339c562ed-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-mt7bq\" (UID: \"03ef49f1-0c6a-443a-8df3-2db339c562ed\") " pod="openshift-multus/multus-additional-cni-plugins-mt7bq" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.593389 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/dd08ece2-7636-4966-973a-e96a34b70b53-proxy-tls\") pod \"machine-config-daemon-jrdcz\" (UID: \"dd08ece2-7636-4966-973a-e96a34b70b53\") " pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.593434 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxf6x\" (UniqueName: \"kubernetes.io/projected/e9789e58-12c8-4831-9401-af48a3e92209-kube-api-access-mxf6x\") pod \"multus-szscw\" (UID: \"e9789e58-12c8-4831-9401-af48a3e92209\") " pod="openshift-multus/multus-szscw" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.593456 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/03ef49f1-0c6a-443a-8df3-2db339c562ed-system-cni-dir\") pod \"multus-additional-cni-plugins-mt7bq\" (UID: 
\"03ef49f1-0c6a-443a-8df3-2db339c562ed\") " pod="openshift-multus/multus-additional-cni-plugins-mt7bq" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.593480 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e9789e58-12c8-4831-9401-af48a3e92209-system-cni-dir\") pod \"multus-szscw\" (UID: \"e9789e58-12c8-4831-9401-af48a3e92209\") " pod="openshift-multus/multus-szscw" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.593531 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e9789e58-12c8-4831-9401-af48a3e92209-multus-cni-dir\") pod \"multus-szscw\" (UID: \"e9789e58-12c8-4831-9401-af48a3e92209\") " pod="openshift-multus/multus-szscw" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.593588 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e9789e58-12c8-4831-9401-af48a3e92209-os-release\") pod \"multus-szscw\" (UID: \"e9789e58-12c8-4831-9401-af48a3e92209\") " pod="openshift-multus/multus-szscw" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.593615 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/e9789e58-12c8-4831-9401-af48a3e92209-host-run-k8s-cni-cncf-io\") pod \"multus-szscw\" (UID: \"e9789e58-12c8-4831-9401-af48a3e92209\") " pod="openshift-multus/multus-szscw" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.593676 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/e9789e58-12c8-4831-9401-af48a3e92209-host-run-multus-certs\") pod \"multus-szscw\" (UID: 
\"e9789e58-12c8-4831-9401-af48a3e92209\") " pod="openshift-multus/multus-szscw" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.593745 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e9789e58-12c8-4831-9401-af48a3e92209-cnibin\") pod \"multus-szscw\" (UID: \"e9789e58-12c8-4831-9401-af48a3e92209\") " pod="openshift-multus/multus-szscw" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.593813 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e9789e58-12c8-4831-9401-af48a3e92209-cni-binary-copy\") pod \"multus-szscw\" (UID: \"e9789e58-12c8-4831-9401-af48a3e92209\") " pod="openshift-multus/multus-szscw" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.594630 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c8786d1-025d-4788-bafe-c2c2eaf8e398\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46db23088e8ba791d736f70a65bd066af80a5ec27c31332d9ac9c52530b21bfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da7b257af12b9ee605dc32a26d9565f866a9cafebb4936a0bde7f42ae9909f80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c2523ea43f2490afb81354be8a2840f54a3be1a8d3154196e0c8ac365d09723a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0316 00:07:59.373922 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0316 00:07:59.374075 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0316 00:07:59.374860 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2696044728/tls.crt::/tmp/serving-cert-2696044728/tls.key\\\\\\\"\\\\nI0316 00:07:59.560928 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0316 00:07:59.563996 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0316 00:07:59.564013 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0316 00:07:59.564035 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0316 00:07:59.564041 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0316 00:07:59.571475 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0316 00:07:59.571523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571531 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571540 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0316 00:07:59.571574 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0316 00:07:59.571588 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0316 00:07:59.571493 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0316 00:07:59.571597 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0316 00:07:59.573484 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81d0c40be9c3912864280cd98481f837ef29f69d85599b39decf5e9d7a97eff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:31Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.610626 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://516852108b3d17a15676573ff080d3d94fa48d8076ed0404a8d827e715c2555e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:31Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.624054 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:31Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.639695 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mt7bq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"03ef49f1-0c6a-443a-8df3-2db339c562ed\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mt7bq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:31Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.655229 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:31Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.671398 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:31Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.676816 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.676844 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.676855 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.676869 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.676878 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:31Z","lastTransitionTime":"2026-03-16T00:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.695000 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/03ef49f1-0c6a-443a-8df3-2db339c562ed-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-mt7bq\" (UID: \"03ef49f1-0c6a-443a-8df3-2db339c562ed\") " pod="openshift-multus/multus-additional-cni-plugins-mt7bq" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.695064 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/dd08ece2-7636-4966-973a-e96a34b70b53-proxy-tls\") pod \"machine-config-daemon-jrdcz\" (UID: \"dd08ece2-7636-4966-973a-e96a34b70b53\") " pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.695094 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mxf6x\" (UniqueName: \"kubernetes.io/projected/e9789e58-12c8-4831-9401-af48a3e92209-kube-api-access-mxf6x\") pod \"multus-szscw\" (UID: \"e9789e58-12c8-4831-9401-af48a3e92209\") " pod="openshift-multus/multus-szscw" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.695115 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e9789e58-12c8-4831-9401-af48a3e92209-system-cni-dir\") pod \"multus-szscw\" (UID: \"e9789e58-12c8-4831-9401-af48a3e92209\") " pod="openshift-multus/multus-szscw" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.695136 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e9789e58-12c8-4831-9401-af48a3e92209-multus-cni-dir\") pod \"multus-szscw\" (UID: 
\"e9789e58-12c8-4831-9401-af48a3e92209\") " pod="openshift-multus/multus-szscw" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.695155 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/03ef49f1-0c6a-443a-8df3-2db339c562ed-system-cni-dir\") pod \"multus-additional-cni-plugins-mt7bq\" (UID: \"03ef49f1-0c6a-443a-8df3-2db339c562ed\") " pod="openshift-multus/multus-additional-cni-plugins-mt7bq" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.695174 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e9789e58-12c8-4831-9401-af48a3e92209-os-release\") pod \"multus-szscw\" (UID: \"e9789e58-12c8-4831-9401-af48a3e92209\") " pod="openshift-multus/multus-szscw" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.695193 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/e9789e58-12c8-4831-9401-af48a3e92209-host-run-k8s-cni-cncf-io\") pod \"multus-szscw\" (UID: \"e9789e58-12c8-4831-9401-af48a3e92209\") " pod="openshift-multus/multus-szscw" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.695210 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/e9789e58-12c8-4831-9401-af48a3e92209-host-run-multus-certs\") pod \"multus-szscw\" (UID: \"e9789e58-12c8-4831-9401-af48a3e92209\") " pod="openshift-multus/multus-szscw" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.695235 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e9789e58-12c8-4831-9401-af48a3e92209-cnibin\") pod \"multus-szscw\" (UID: \"e9789e58-12c8-4831-9401-af48a3e92209\") " pod="openshift-multus/multus-szscw" Mar 16 00:08:31 crc kubenswrapper[4816]: 
I0316 00:08:31.695252 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e9789e58-12c8-4831-9401-af48a3e92209-cni-binary-copy\") pod \"multus-szscw\" (UID: \"e9789e58-12c8-4831-9401-af48a3e92209\") " pod="openshift-multus/multus-szscw" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.695272 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/03ef49f1-0c6a-443a-8df3-2db339c562ed-tuning-conf-dir\") pod \"multus-additional-cni-plugins-mt7bq\" (UID: \"03ef49f1-0c6a-443a-8df3-2db339c562ed\") " pod="openshift-multus/multus-additional-cni-plugins-mt7bq" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.695290 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e9789e58-12c8-4831-9401-af48a3e92209-host-run-netns\") pod \"multus-szscw\" (UID: \"e9789e58-12c8-4831-9401-af48a3e92209\") " pod="openshift-multus/multus-szscw" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.695314 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e9789e58-12c8-4831-9401-af48a3e92209-host-var-lib-cni-bin\") pod \"multus-szscw\" (UID: \"e9789e58-12c8-4831-9401-af48a3e92209\") " pod="openshift-multus/multus-szscw" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.695341 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/03ef49f1-0c6a-443a-8df3-2db339c562ed-cnibin\") pod \"multus-additional-cni-plugins-mt7bq\" (UID: \"03ef49f1-0c6a-443a-8df3-2db339c562ed\") " pod="openshift-multus/multus-additional-cni-plugins-mt7bq" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.695364 4816 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-trwbh\" (UniqueName: \"kubernetes.io/projected/dd08ece2-7636-4966-973a-e96a34b70b53-kube-api-access-trwbh\") pod \"machine-config-daemon-jrdcz\" (UID: \"dd08ece2-7636-4966-973a-e96a34b70b53\") " pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.695388 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e9789e58-12c8-4831-9401-af48a3e92209-host-var-lib-kubelet\") pod \"multus-szscw\" (UID: \"e9789e58-12c8-4831-9401-af48a3e92209\") " pod="openshift-multus/multus-szscw" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.695410 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e9789e58-12c8-4831-9401-af48a3e92209-etc-kubernetes\") pod \"multus-szscw\" (UID: \"e9789e58-12c8-4831-9401-af48a3e92209\") " pod="openshift-multus/multus-szscw" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.695439 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/dd08ece2-7636-4966-973a-e96a34b70b53-mcd-auth-proxy-config\") pod \"machine-config-daemon-jrdcz\" (UID: \"dd08ece2-7636-4966-973a-e96a34b70b53\") " pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.695462 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/e9789e58-12c8-4831-9401-af48a3e92209-multus-socket-dir-parent\") pod \"multus-szscw\" (UID: \"e9789e58-12c8-4831-9401-af48a3e92209\") " pod="openshift-multus/multus-szscw" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.695487 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/e9789e58-12c8-4831-9401-af48a3e92209-host-var-lib-cni-multus\") pod \"multus-szscw\" (UID: \"e9789e58-12c8-4831-9401-af48a3e92209\") " pod="openshift-multus/multus-szscw" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.695520 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xfvdb\" (UniqueName: \"kubernetes.io/projected/03ef49f1-0c6a-443a-8df3-2db339c562ed-kube-api-access-xfvdb\") pod \"multus-additional-cni-plugins-mt7bq\" (UID: \"03ef49f1-0c6a-443a-8df3-2db339c562ed\") " pod="openshift-multus/multus-additional-cni-plugins-mt7bq" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.695577 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/e9789e58-12c8-4831-9401-af48a3e92209-multus-daemon-config\") pod \"multus-szscw\" (UID: \"e9789e58-12c8-4831-9401-af48a3e92209\") " pod="openshift-multus/multus-szscw" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.695573 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e9789e58-12c8-4831-9401-af48a3e92209-host-run-netns\") pod \"multus-szscw\" (UID: \"e9789e58-12c8-4831-9401-af48a3e92209\") " pod="openshift-multus/multus-szscw" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.695611 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/e9789e58-12c8-4831-9401-af48a3e92209-host-run-k8s-cni-cncf-io\") pod \"multus-szscw\" (UID: \"e9789e58-12c8-4831-9401-af48a3e92209\") " pod="openshift-multus/multus-szscw" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.695676 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: 
\"kubernetes.io/host-path/e9789e58-12c8-4831-9401-af48a3e92209-host-run-multus-certs\") pod \"multus-szscw\" (UID: \"e9789e58-12c8-4831-9401-af48a3e92209\") " pod="openshift-multus/multus-szscw" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.695602 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/03ef49f1-0c6a-443a-8df3-2db339c562ed-os-release\") pod \"multus-additional-cni-plugins-mt7bq\" (UID: \"03ef49f1-0c6a-443a-8df3-2db339c562ed\") " pod="openshift-multus/multus-additional-cni-plugins-mt7bq" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.695704 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/03ef49f1-0c6a-443a-8df3-2db339c562ed-os-release\") pod \"multus-additional-cni-plugins-mt7bq\" (UID: \"03ef49f1-0c6a-443a-8df3-2db339c562ed\") " pod="openshift-multus/multus-additional-cni-plugins-mt7bq" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.695804 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e9789e58-12c8-4831-9401-af48a3e92209-multus-conf-dir\") pod \"multus-szscw\" (UID: \"e9789e58-12c8-4831-9401-af48a3e92209\") " pod="openshift-multus/multus-szscw" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.695815 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e9789e58-12c8-4831-9401-af48a3e92209-cnibin\") pod \"multus-szscw\" (UID: \"e9789e58-12c8-4831-9401-af48a3e92209\") " pod="openshift-multus/multus-szscw" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.695831 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/dd08ece2-7636-4966-973a-e96a34b70b53-rootfs\") pod \"machine-config-daemon-jrdcz\" (UID: 
\"dd08ece2-7636-4966-973a-e96a34b70b53\") " pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.695856 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/e9789e58-12c8-4831-9401-af48a3e92209-hostroot\") pod \"multus-szscw\" (UID: \"e9789e58-12c8-4831-9401-af48a3e92209\") " pod="openshift-multus/multus-szscw" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.695878 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/03ef49f1-0c6a-443a-8df3-2db339c562ed-cni-binary-copy\") pod \"multus-additional-cni-plugins-mt7bq\" (UID: \"03ef49f1-0c6a-443a-8df3-2db339c562ed\") " pod="openshift-multus/multus-additional-cni-plugins-mt7bq" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.696325 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e9789e58-12c8-4831-9401-af48a3e92209-host-var-lib-cni-bin\") pod \"multus-szscw\" (UID: \"e9789e58-12c8-4831-9401-af48a3e92209\") " pod="openshift-multus/multus-szscw" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.696375 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/03ef49f1-0c6a-443a-8df3-2db339c562ed-cnibin\") pod \"multus-additional-cni-plugins-mt7bq\" (UID: \"03ef49f1-0c6a-443a-8df3-2db339c562ed\") " pod="openshift-multus/multus-additional-cni-plugins-mt7bq" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.696717 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/e9789e58-12c8-4831-9401-af48a3e92209-host-var-lib-cni-multus\") pod \"multus-szscw\" (UID: \"e9789e58-12c8-4831-9401-af48a3e92209\") " 
pod="openshift-multus/multus-szscw" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.696765 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e9789e58-12c8-4831-9401-af48a3e92209-host-var-lib-kubelet\") pod \"multus-szscw\" (UID: \"e9789e58-12c8-4831-9401-af48a3e92209\") " pod="openshift-multus/multus-szscw" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.696796 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e9789e58-12c8-4831-9401-af48a3e92209-etc-kubernetes\") pod \"multus-szscw\" (UID: \"e9789e58-12c8-4831-9401-af48a3e92209\") " pod="openshift-multus/multus-szscw" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.696789 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/03ef49f1-0c6a-443a-8df3-2db339c562ed-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-mt7bq\" (UID: \"03ef49f1-0c6a-443a-8df3-2db339c562ed\") " pod="openshift-multus/multus-additional-cni-plugins-mt7bq" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.696781 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e9789e58-12c8-4831-9401-af48a3e92209-system-cni-dir\") pod \"multus-szscw\" (UID: \"e9789e58-12c8-4831-9401-af48a3e92209\") " pod="openshift-multus/multus-szscw" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.696882 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/dd08ece2-7636-4966-973a-e96a34b70b53-rootfs\") pod \"machine-config-daemon-jrdcz\" (UID: \"dd08ece2-7636-4966-973a-e96a34b70b53\") " pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.696928 4816 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/e9789e58-12c8-4831-9401-af48a3e92209-hostroot\") pod \"multus-szscw\" (UID: \"e9789e58-12c8-4831-9401-af48a3e92209\") " pod="openshift-multus/multus-szscw" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.696941 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/03ef49f1-0c6a-443a-8df3-2db339c562ed-system-cni-dir\") pod \"multus-additional-cni-plugins-mt7bq\" (UID: \"03ef49f1-0c6a-443a-8df3-2db339c562ed\") " pod="openshift-multus/multus-additional-cni-plugins-mt7bq" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.697030 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e9789e58-12c8-4831-9401-af48a3e92209-multus-conf-dir\") pod \"multus-szscw\" (UID: \"e9789e58-12c8-4831-9401-af48a3e92209\") " pod="openshift-multus/multus-szscw" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.697067 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e9789e58-12c8-4831-9401-af48a3e92209-os-release\") pod \"multus-szscw\" (UID: \"e9789e58-12c8-4831-9401-af48a3e92209\") " pod="openshift-multus/multus-szscw" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.697132 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e9789e58-12c8-4831-9401-af48a3e92209-multus-cni-dir\") pod \"multus-szscw\" (UID: \"e9789e58-12c8-4831-9401-af48a3e92209\") " pod="openshift-multus/multus-szscw" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.697249 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: 
\"kubernetes.io/host-path/e9789e58-12c8-4831-9401-af48a3e92209-multus-socket-dir-parent\") pod \"multus-szscw\" (UID: \"e9789e58-12c8-4831-9401-af48a3e92209\") " pod="openshift-multus/multus-szscw" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.697285 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/03ef49f1-0c6a-443a-8df3-2db339c562ed-cni-binary-copy\") pod \"multus-additional-cni-plugins-mt7bq\" (UID: \"03ef49f1-0c6a-443a-8df3-2db339c562ed\") " pod="openshift-multus/multus-additional-cni-plugins-mt7bq" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.697623 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e9789e58-12c8-4831-9401-af48a3e92209-cni-binary-copy\") pod \"multus-szscw\" (UID: \"e9789e58-12c8-4831-9401-af48a3e92209\") " pod="openshift-multus/multus-szscw" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.697623 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/dd08ece2-7636-4966-973a-e96a34b70b53-mcd-auth-proxy-config\") pod \"machine-config-daemon-jrdcz\" (UID: \"dd08ece2-7636-4966-973a-e96a34b70b53\") " pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.697728 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/03ef49f1-0c6a-443a-8df3-2db339c562ed-tuning-conf-dir\") pod \"multus-additional-cni-plugins-mt7bq\" (UID: \"03ef49f1-0c6a-443a-8df3-2db339c562ed\") " pod="openshift-multus/multus-additional-cni-plugins-mt7bq" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.698002 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: 
\"kubernetes.io/configmap/e9789e58-12c8-4831-9401-af48a3e92209-multus-daemon-config\") pod \"multus-szscw\" (UID: \"e9789e58-12c8-4831-9401-af48a3e92209\") " pod="openshift-multus/multus-szscw" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.700532 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/dd08ece2-7636-4966-973a-e96a34b70b53-proxy-tls\") pod \"machine-config-daemon-jrdcz\" (UID: \"dd08ece2-7636-4966-973a-e96a34b70b53\") " pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.702009 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49e35ef6-7a6d-438f-b598-4445d73db379\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbe90b49c5d42bb9d6d5d1b8f1d02b571403dd3f39d3118081351d72c4910914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9
be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://508e6dbe6f2f1392d16501e09ee2ec523a3c2a245cd96c6ab773e26e83b8bb6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b5a3223ff0fb5b96e5b87ee836add66498165759f4d6ed8cb6413f091967e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\
"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://603bdcbd4a6af5fda7ca09e4823e0db8d7ec3e3eb38eddf9f062a14b433cb094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b061f16a65a27b9a5ef66231376b70fec084479a991b7fdd430957df76da32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.1
68.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b5
4b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:31Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.712643 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-trwbh\" (UniqueName: \"kubernetes.io/projected/dd08ece2-7636-4966-973a-e96a34b70b53-kube-api-access-trwbh\") pod \"machine-config-daemon-jrdcz\" (UID: \"dd08ece2-7636-4966-973a-e96a34b70b53\") " pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.713869 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-szscw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9789e58-12c8-4831-9401-af48a3e92209\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxf6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-szscw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:31Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.718358 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xfvdb\" (UniqueName: \"kubernetes.io/projected/03ef49f1-0c6a-443a-8df3-2db339c562ed-kube-api-access-xfvdb\") pod \"multus-additional-cni-plugins-mt7bq\" (UID: \"03ef49f1-0c6a-443a-8df3-2db339c562ed\") " pod="openshift-multus/multus-additional-cni-plugins-mt7bq" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.719743 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxf6x\" (UniqueName: \"kubernetes.io/projected/e9789e58-12c8-4831-9401-af48a3e92209-kube-api-access-mxf6x\") pod \"multus-szscw\" (UID: \"e9789e58-12c8-4831-9401-af48a3e92209\") " pod="openshift-multus/multus-szscw" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.728430 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37869baff974556c22aadadfa4b528b5d6b9ad236107b4dbc945b3cdbf743f20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f9fe727708fdaca68596b7d305629ade45b061a88a81085f885d490ab99e24b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:31Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.738907 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cnhkf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e686cd4-bddf-463e-b471-e49ea862691e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bwzt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cnhkf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:31Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.751441 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://723654bc91ce4d40a27b14a9c4df1ed6c8dff2517f378927d7a6a96aeaa6951b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:31Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.761965 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:31Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.774529 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mt7bq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"03ef49f1-0c6a-443a-8df3-2db339c562ed\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni 
whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\
\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,
\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mt7bq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:31Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.778742 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.778778 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.778792 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.778813 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.778826 4816 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:31Z","lastTransitionTime":"2026-03-16T00:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.788177 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c8786d1-025d-4788-bafe-c2c2eaf8e398\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46db23088e8ba791d736f70a65bd066af80a5ec27c31332d9ac9c52530b21bfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da7b257af12b9ee605dc32a26d9565f866a9cafebb4936a0bde7f42ae9909f80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c2523ea43f2490afb81354be8a2840f54a3be1a8d3154196e0c8ac365d09723a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0316 00:07:59.373922 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0316 00:07:59.374075 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0316 00:07:59.374860 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2696044728/tls.crt::/tmp/serving-cert-2696044728/tls.key\\\\\\\"\\\\nI0316 00:07:59.560928 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0316 00:07:59.563996 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0316 00:07:59.564013 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0316 00:07:59.564035 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0316 00:07:59.564041 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0316 00:07:59.571475 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0316 00:07:59.571523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571531 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571540 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0316 00:07:59.571574 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0316 00:07:59.571588 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0316 00:07:59.571493 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0316 00:07:59.571597 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0316 00:07:59.573484 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81d0c40be9c3912864280cd98481f837ef29f69d85599b39decf5e9d7a97eff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:31Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.804174 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://516852108b3d17a15676573ff080d3d94fa48d8076ed0404a8d827e715c2555e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:31Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.818896 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-szscw" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.818897 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with 
unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:31Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:31 crc kubenswrapper[4816]: W0316 00:08:31.831139 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode9789e58_12c8_4831_9401_af48a3e92209.slice/crio-d14e6099281a3f4a5c54bb47b271be878d39affb726d275c2082db2728e837cc WatchSource:0}: Error finding container d14e6099281a3f4a5c54bb47b271be878d39affb726d275c2082db2728e837cc: Status 404 returned error can't find the container with id 
d14e6099281a3f4a5c54bb47b271be878d39affb726d275c2082db2728e837cc Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.835627 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:31Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.853318 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-mt7bq" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.854419 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd08ece2-7636-4966-973a-e96a34b70b53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trwbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trwbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jrdcz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:31Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.862089 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" Mar 16 00:08:31 crc kubenswrapper[4816]: W0316 00:08:31.869368 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod03ef49f1_0c6a_443a_8df3_2db339c562ed.slice/crio-2322d7ca26005146ca00985bb21a70c27a081273e8ba13a44e439ebce554fff7 WatchSource:0}: Error finding container 2322d7ca26005146ca00985bb21a70c27a081273e8ba13a44e439ebce554fff7: Status 404 returned error can't find the container with id 2322d7ca26005146ca00985bb21a70c27a081273e8ba13a44e439ebce554fff7 Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.877238 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"49e35ef6-7a6d-438f-b598-4445d73db379\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbe90b49c5d42bb9d6d5d1b8f1d02b571403dd3f39d3118081351d72c4910914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://508e6dbe6f2f1392d16501e09ee2ec523a3c2a245cd96c6ab773e26e83b8bb6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b5a3223ff0fb5b96e5b87ee836add66498165759f4d6ed8cb6413f091967e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://603bdcbd4a6af5fda7ca09e4823e0db8d7ec3e3eb38eddf9f062a14b433cb094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b061f16a65a27b9a5ef66231376b70fec084479a991b7fdd430957df76da32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:31Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.880931 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.880961 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.880975 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.880993 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.881006 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:31Z","lastTransitionTime":"2026-03-16T00:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.890131 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-psjs7"] Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.891347 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.894808 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.894964 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.895033 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.895466 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.895745 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.895764 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.895983 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 16 00:08:31 crc kubenswrapper[4816]: W0316 00:08:31.897885 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddd08ece2_7636_4966_973a_e96a34b70b53.slice/crio-2f008ed10596c37892fa68d6a991ef4c4c25f62429883230c3721018781ad8a8 WatchSource:0}: Error 
finding container 2f008ed10596c37892fa68d6a991ef4c4c25f62429883230c3721018781ad8a8: Status 404 returned error can't find the container with id 2f008ed10596c37892fa68d6a991ef4c4c25f62429883230c3721018781ad8a8 Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.902681 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-szscw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9789e58-12c8-4831-9401-af48a3e92209\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxf6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-szscw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:31Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.919943 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:31Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.944467 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:31Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.965317 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd08ece2-7636-4966-973a-e96a34b70b53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trwbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trwbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jrdcz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:31Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.984298 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.984339 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.984348 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.984366 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.984378 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:31Z","lastTransitionTime":"2026-03-16T00:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.995408 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-psjs7\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:31Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.998743 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-var-lib-openvswitch\") pod \"ovnkube-node-psjs7\" (UID: \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.998807 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-psjs7\" (UID: \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.998836 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-systemd-units\") pod \"ovnkube-node-psjs7\" (UID: \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.998865 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-run-ovn\") pod \"ovnkube-node-psjs7\" (UID: \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.998886 4816 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-node-log\") pod \"ovnkube-node-psjs7\" (UID: \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.998910 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-host-cni-netd\") pod \"ovnkube-node-psjs7\" (UID: \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.998934 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-ovnkube-config\") pod \"ovnkube-node-psjs7\" (UID: \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.998996 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-host-cni-bin\") pod \"ovnkube-node-psjs7\" (UID: \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.999040 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-host-run-ovn-kubernetes\") pod \"ovnkube-node-psjs7\" (UID: \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 
00:08:31.999060 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-env-overrides\") pod \"ovnkube-node-psjs7\" (UID: \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.999081 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-host-kubelet\") pod \"ovnkube-node-psjs7\" (UID: \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.999109 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-run-openvswitch\") pod \"ovnkube-node-psjs7\" (UID: \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.999133 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-run-systemd\") pod \"ovnkube-node-psjs7\" (UID: \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.999167 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-host-run-netns\") pod \"ovnkube-node-psjs7\" (UID: \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" Mar 16 00:08:31 crc kubenswrapper[4816]: 
I0316 00:08:31.999196 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-ovn-node-metrics-cert\") pod \"ovnkube-node-psjs7\" (UID: \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.999308 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-ovnkube-script-lib\") pod \"ovnkube-node-psjs7\" (UID: \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" Mar 16 00:08:31 crc kubenswrapper[4816]: I0316 00:08:31.999439 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-etc-openvswitch\") pod \"ovnkube-node-psjs7\" (UID: \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:31.999542 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-host-slash\") pod \"ovnkube-node-psjs7\" (UID: \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:31.999582 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-log-socket\") pod \"ovnkube-node-psjs7\" (UID: \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" Mar 16 00:08:32 crc 
kubenswrapper[4816]: I0316 00:08:31.999599 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rd68\" (UniqueName: \"kubernetes.io/projected/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-kube-api-access-9rd68\") pod \"ovnkube-node-psjs7\" (UID: \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.018269 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49e35ef6-7a6d-438f-b598-4445d73db379\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbe90b49c5d42bb9d6d5d1b8f1d02b571403dd3f39d3118081351d72c4910914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"st
artedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://508e6dbe6f2f1392d16501e09ee2ec523a3c2a245cd96c6ab773e26e83b8bb6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b5a3223ff0fb5b96e5b87ee836add66498165759f4d6ed8cb6413f091967e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://603bdcbd4a6af5fda7ca09e4823e0db8d7ec3e3eb38eddf9f062a14b433cb094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b061f16a65a27b9a5ef66231376b70fec084479a991b7fdd430957df76da32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4
f1b15b174\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"
name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:32Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.035635 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-szscw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9789e58-12c8-4831-9401-af48a3e92209\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxf6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-szscw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:32Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.051282 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37869baff974556c22aadadfa4b528b5d6b9ad236107b4dbc945b3cdbf743f20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a
2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f9fe727708fdaca68596b7d305629ade45b061a88a81085f885d490ab99e24b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-16T00:08:32Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.065290 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cnhkf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e686cd4-bddf-463e-b471-e49ea862691e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bwzt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cnhkf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:32Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.080464 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://723654bc91ce4d40a27b14a9c4df1ed6c8dff2517f378927d7a6a96aeaa6951b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:32Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.092193 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.092247 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.092265 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.092293 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.092313 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:32Z","lastTransitionTime":"2026-03-16T00:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.093906 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:32Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.100356 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-psjs7\" (UID: \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.100438 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-psjs7\" (UID: \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.100473 4816 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-systemd-units\") pod \"ovnkube-node-psjs7\" (UID: \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.100495 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-run-ovn\") pod \"ovnkube-node-psjs7\" (UID: \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.100508 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-node-log\") pod \"ovnkube-node-psjs7\" (UID: \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.100542 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-systemd-units\") pod \"ovnkube-node-psjs7\" (UID: \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.100586 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-run-ovn\") pod \"ovnkube-node-psjs7\" (UID: \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.100610 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-ovnkube-config\") pod \"ovnkube-node-psjs7\" (UID: \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.100638 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-host-cni-bin\") pod \"ovnkube-node-psjs7\" (UID: \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.100673 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-node-log\") pod \"ovnkube-node-psjs7\" (UID: \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.101318 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-host-cni-bin\") pod \"ovnkube-node-psjs7\" (UID: \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.101430 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-ovnkube-config\") pod \"ovnkube-node-psjs7\" (UID: \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.101500 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-host-cni-netd\") pod \"ovnkube-node-psjs7\" (UID: 
\"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.101531 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-host-run-ovn-kubernetes\") pod \"ovnkube-node-psjs7\" (UID: \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.101610 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-host-kubelet\") pod \"ovnkube-node-psjs7\" (UID: \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.101665 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-host-kubelet\") pod \"ovnkube-node-psjs7\" (UID: \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.101656 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-host-cni-netd\") pod \"ovnkube-node-psjs7\" (UID: \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.101692 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-env-overrides\") pod \"ovnkube-node-psjs7\" (UID: \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.101745 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-host-run-ovn-kubernetes\") pod \"ovnkube-node-psjs7\" (UID: \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.101779 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-run-openvswitch\") pod \"ovnkube-node-psjs7\" (UID: \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.101937 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-run-systemd\") pod \"ovnkube-node-psjs7\" (UID: \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.101800 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-run-openvswitch\") pod \"ovnkube-node-psjs7\" (UID: \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.101984 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-ovn-node-metrics-cert\") pod \"ovnkube-node-psjs7\" (UID: \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" Mar 16 00:08:32 
crc kubenswrapper[4816]: I0316 00:08:32.102032 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-run-systemd\") pod \"ovnkube-node-psjs7\" (UID: \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.102057 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-ovnkube-script-lib\") pod \"ovnkube-node-psjs7\" (UID: \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.102090 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-host-run-netns\") pod \"ovnkube-node-psjs7\" (UID: \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.102110 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-host-slash\") pod \"ovnkube-node-psjs7\" (UID: \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.102132 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-etc-openvswitch\") pod \"ovnkube-node-psjs7\" (UID: \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.102160 4816 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-log-socket\") pod \"ovnkube-node-psjs7\" (UID: \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.102176 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-host-run-netns\") pod \"ovnkube-node-psjs7\" (UID: \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.102181 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9rd68\" (UniqueName: \"kubernetes.io/projected/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-kube-api-access-9rd68\") pod \"ovnkube-node-psjs7\" (UID: \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.102250 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-var-lib-openvswitch\") pod \"ovnkube-node-psjs7\" (UID: \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.102329 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-var-lib-openvswitch\") pod \"ovnkube-node-psjs7\" (UID: \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.102319 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" 
(UniqueName: \"kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-etc-openvswitch\") pod \"ovnkube-node-psjs7\" (UID: \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.102358 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-host-slash\") pod \"ovnkube-node-psjs7\" (UID: \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.102386 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-log-socket\") pod \"ovnkube-node-psjs7\" (UID: \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.102742 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-env-overrides\") pod \"ovnkube-node-psjs7\" (UID: \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.102965 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-ovnkube-script-lib\") pod \"ovnkube-node-psjs7\" (UID: \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.106199 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-ovn-node-metrics-cert\") pod 
\"ovnkube-node-psjs7\" (UID: \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.113435 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" event={"ID":"dd08ece2-7636-4966-973a-e96a34b70b53","Type":"ContainerStarted","Data":"7003a4592a48f87ab63e9f5637c7665040f9784a7c8f599c82ed61270ddb793c"} Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.113497 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" event={"ID":"dd08ece2-7636-4966-973a-e96a34b70b53","Type":"ContainerStarted","Data":"2f008ed10596c37892fa68d6a991ef4c4c25f62429883230c3721018781ad8a8"} Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.114479 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mt7bq" event={"ID":"03ef49f1-0c6a-443a-8df3-2db339c562ed","Type":"ContainerStarted","Data":"8979f751ec4797986924a3efea84d02bc1c1cf303e1e992f349c9083f2fe4f08"} Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.114507 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mt7bq" event={"ID":"03ef49f1-0c6a-443a-8df3-2db339c562ed","Type":"ContainerStarted","Data":"2322d7ca26005146ca00985bb21a70c27a081273e8ba13a44e439ebce554fff7"} Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.116028 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-szscw" event={"ID":"e9789e58-12c8-4831-9401-af48a3e92209","Type":"ContainerStarted","Data":"e6f0b7e8f95bbbf3b90133349b8b13d2784582a5d95dae8dd159328b570ae962"} Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.116074 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-szscw" 
event={"ID":"e9789e58-12c8-4831-9401-af48a3e92209","Type":"ContainerStarted","Data":"d14e6099281a3f4a5c54bb47b271be878d39affb726d275c2082db2728e837cc"} Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.116514 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mt7bq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"03ef49f1-0c6a-443a-8df3-2db339c562ed\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mt7bq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:32Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.118826 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-cnhkf" event={"ID":"3e686cd4-bddf-463e-b471-e49ea862691e","Type":"ContainerStarted","Data":"8f6c3cacd4d7f6866e66b7234b280540ef2c443531de7a11f6894323ef24a9fb"} Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.118863 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-cnhkf" event={"ID":"3e686cd4-bddf-463e-b471-e49ea862691e","Type":"ContainerStarted","Data":"ee4ee15b9147a70ac3d21b58b9d3cb23b0646c2fdf284bbf82e86743ee26a221"} Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.121507 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rd68\" (UniqueName: \"kubernetes.io/projected/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-kube-api-access-9rd68\") pod \"ovnkube-node-psjs7\" (UID: \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.133729 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c8786d1-025d-4788-bafe-c2c2eaf8e398\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46db23088e8ba791d736f70a65bd066af80a5ec27c31332d9ac9c52530b21bfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da7b257af12b9ee605dc32a26d9565f866a9cafebb4936a0bde7f42ae9909f80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2523ea43f2490afb81354be8a2840f54a3be1a8d3154196e0c8ac365d09723a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0316 00:07:59.373922 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0316 00:07:59.374075 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0316 00:07:59.374860 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2696044728/tls.crt::/tmp/serving-cert-2696044728/tls.key\\\\\\\"\\\\nI0316 00:07:59.560928 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0316 00:07:59.563996 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0316 00:07:59.564013 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0316 00:07:59.564035 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0316 00:07:59.564041 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0316 00:07:59.571475 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0316 00:07:59.571523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571531 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571540 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0316 00:07:59.571574 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0316 00:07:59.571588 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0316 00:07:59.571493 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0316 00:07:59.571597 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0316 00:07:59.573484 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81d0c40be9c3912864280cd98481f837ef29f69d85599b39decf5e9d7a97eff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:32Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.146628 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://516852108b3d17a15676573ff080d3d94fa48d8076ed0404a8d827e715c2555e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-16T00:08:32Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.161145 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c8786d1-025d-4788-bafe-c2c2eaf8e398\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46db23088e8ba791d736f70a65bd066af80a5ec27c31332d9ac9c52530b21bfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da7b257af12b9ee605dc32a26d9565f866a9cafebb4936a0bde7f42ae9909f80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c2523ea43f2490afb81354be8a2840f54a3be1a8d3154196e0c8ac365d09723a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0316 00:07:59.373922 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0316 00:07:59.374075 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0316 00:07:59.374860 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2696044728/tls.crt::/tmp/serving-cert-2696044728/tls.key\\\\\\\"\\\\nI0316 00:07:59.560928 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0316 00:07:59.563996 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0316 00:07:59.564013 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0316 00:07:59.564035 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0316 00:07:59.564041 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0316 00:07:59.571475 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0316 00:07:59.571523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571531 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571540 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0316 00:07:59.571574 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0316 00:07:59.571588 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0316 00:07:59.571493 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0316 00:07:59.571597 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0316 00:07:59.573484 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81d0c40be9c3912864280cd98481f837ef29f69d85599b39decf5e9d7a97eff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:32Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.175065 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://516852108b3d17a15676573ff080d3d94fa48d8076ed0404a8d827e715c2555e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:32Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.190225 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:32Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.194907 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.194954 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.194964 4816 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.194979 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.194988 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:32Z","lastTransitionTime":"2026-03-16T00:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.207916 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mt7bq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"03ef49f1-0c6a-443a-8df3-2db339c562ed\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8979f751ec4797986924a3efea84d02bc1c1cf303e1e992f349c9083f2fe4f08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\
"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d
742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mt7bq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:32Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.208354 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" Mar 16 00:08:32 crc kubenswrapper[4816]: W0316 00:08:32.218928 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2ca6e6b1_6b9c_4bb0_8e08_8201c9c53e88.slice/crio-5a5bb9275c0f8cb8f5457ed5c2f6ecf42790ebe9298a9783a1a55a1c78e14761 WatchSource:0}: Error finding container 5a5bb9275c0f8cb8f5457ed5c2f6ecf42790ebe9298a9783a1a55a1c78e14761: Status 404 returned error can't find the container with id 5a5bb9275c0f8cb8f5457ed5c2f6ecf42790ebe9298a9783a1a55a1c78e14761 Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.225095 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:32Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.239538 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:32Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.262908 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"49e35ef6-7a6d-438f-b598-4445d73db379\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbe90b49c5d42bb9d6d5d1b8f1d02b571403dd3f39d3118081351d72c4910914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://508e6dbe6f2f1392d16501e09ee2ec523a3c2a245cd96c6ab773e26e83b8bb6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b5a3223ff0fb5b96e5b87ee836add66498165759f4d6ed8cb6413f091967e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://603bdcbd4a6af5fda7ca09e4823e0db8d7ec3e3eb38eddf9f062a14b433cb094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b061f16a65a27b9a5ef66231376b70fec084479a991b7fdd430957df76da32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:32Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.282450 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-szscw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9789e58-12c8-4831-9401-af48a3e92209\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6f0b7e8f95bbbf3b90133349b8b13d2784582a5d95dae8dd159328b570ae962\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f1
3fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxf6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"ho
stIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-szscw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:32Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.295673 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd08ece2-7636-4966-973a-e96a34b70b53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trwbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trwbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jrdcz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:32Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.296965 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.296994 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.297007 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.297023 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.297035 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:32Z","lastTransitionTime":"2026-03-16T00:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.323997 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-psjs7\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:32Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.348226 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://723654bc91ce4d40a27b14a9c4df1ed6c8dff2517f378927d7a6a96aeaa6951b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"m
etrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:32Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.362025 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37869baff974556c22aadadfa4b528b5d6b9ad236107b4dbc945b3cdbf743f20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,
\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f9fe727708fdaca68596b7d305629ade45b061a88a81085f885d490ab99e24b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:32Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.376809 4816 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-dns/node-resolver-cnhkf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e686cd4-bddf-463e-b471-e49ea862691e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f6c3cacd4d7f6866e66b7234b280540ef2c443531de7a11f6894323ef24a9fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bwzt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.1
68.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cnhkf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:32Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.399162 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.399207 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.399216 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.399236 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.399246 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:32Z","lastTransitionTime":"2026-03-16T00:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.502751 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.502828 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.502847 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.502874 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.502892 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:32Z","lastTransitionTime":"2026-03-16T00:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.606100 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.606157 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.606178 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.606227 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.606251 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:32Z","lastTransitionTime":"2026-03-16T00:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.666866 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.666949 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.667037 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:08:32 crc kubenswrapper[4816]: E0316 00:08:32.667053 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:08:32 crc kubenswrapper[4816]: E0316 00:08:32.667128 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:08:32 crc kubenswrapper[4816]: E0316 00:08:32.667267 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.708789 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.708835 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.708846 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.708863 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.708875 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:32Z","lastTransitionTime":"2026-03-16T00:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.811836 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.811880 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.811894 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.811913 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.811927 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:32Z","lastTransitionTime":"2026-03-16T00:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.914945 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.914977 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.914985 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.915000 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:32 crc kubenswrapper[4816]: I0316 00:08:32.915009 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:32Z","lastTransitionTime":"2026-03-16T00:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.018613 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.018995 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.019013 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.019035 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.019051 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:33Z","lastTransitionTime":"2026-03-16T00:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.121833 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.121867 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.121874 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.121889 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.121898 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:33Z","lastTransitionTime":"2026-03-16T00:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.124594 4816 generic.go:334] "Generic (PLEG): container finished" podID="2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" containerID="1249f8b1c01db98230c10bde49bdb68f17caccc1ba0546e462df4384ca1658dd" exitCode=0 Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.124653 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" event={"ID":"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88","Type":"ContainerDied","Data":"1249f8b1c01db98230c10bde49bdb68f17caccc1ba0546e462df4384ca1658dd"} Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.124675 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" event={"ID":"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88","Type":"ContainerStarted","Data":"5a5bb9275c0f8cb8f5457ed5c2f6ecf42790ebe9298a9783a1a55a1c78e14761"} Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.128757 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" event={"ID":"dd08ece2-7636-4966-973a-e96a34b70b53","Type":"ContainerStarted","Data":"f6dd22d6548986bf126a8ce6b30c7c11d8c37bd0c03539a55cc28fb7debcfc26"} Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.134187 4816 generic.go:334] "Generic (PLEG): container finished" podID="03ef49f1-0c6a-443a-8df3-2db339c562ed" containerID="8979f751ec4797986924a3efea84d02bc1c1cf303e1e992f349c9083f2fe4f08" exitCode=0 Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.134254 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mt7bq" event={"ID":"03ef49f1-0c6a-443a-8df3-2db339c562ed","Type":"ContainerDied","Data":"8979f751ec4797986924a3efea84d02bc1c1cf303e1e992f349c9083f2fe4f08"} Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.143514 4816 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-dns/node-resolver-cnhkf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e686cd4-bddf-463e-b471-e49ea862691e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f6c3cacd4d7f6866e66b7234b280540ef2c443531de7a11f6894323ef24a9fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bwzt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"pha
se\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cnhkf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:33Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.168650 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://723654bc91ce4d40a27b14a9c4df1ed6c8dff2517f378927d7a6a96aeaa6951b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-
03-16T00:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:33Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.188465 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37869baff974556c22aadadfa4b528b5d6b9ad236107b4dbc945b3cdbf743f20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f9fe727708fdaca68596b7d305629ade45b061a88a81085f885d490ab99e24b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:33Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.210063 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mt7bq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"03ef49f1-0c6a-443a-8df3-2db339c562ed\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni 
whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8979f751ec4797986924a3efea84d02bc1c1cf303e1e992f349c9083f2fe4f08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runni
ng\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ent
rypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mt7bq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:33Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.224371 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.224416 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.224425 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:33 crc 
kubenswrapper[4816]: I0316 00:08:33.224442 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.224455 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:33Z","lastTransitionTime":"2026-03-16T00:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.230670 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c8786d1-025d-4788-bafe-c2c2eaf8e398\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46db23088e8ba791d736f70a65bd066af80a5ec27c31332d9ac9c52530b21bfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da7b257af12b9ee605dc32a26d9565f866a9cafebb4936a0bde7f42ae9909f80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c2523ea43f2490afb81354be8a2840f54a3be1a8d3154196e0c8ac365d09723a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0316 00:07:59.373922 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0316 00:07:59.374075 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0316 00:07:59.374860 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2696044728/tls.crt::/tmp/serving-cert-2696044728/tls.key\\\\\\\"\\\\nI0316 00:07:59.560928 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0316 00:07:59.563996 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0316 00:07:59.564013 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0316 00:07:59.564035 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0316 00:07:59.564041 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0316 00:07:59.571475 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0316 00:07:59.571523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571531 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571540 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0316 00:07:59.571574 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0316 00:07:59.571588 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0316 00:07:59.571493 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0316 00:07:59.571597 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0316 00:07:59.573484 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81d0c40be9c3912864280cd98481f837ef29f69d85599b39decf5e9d7a97eff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:33Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.247593 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://516852108b3d17a15676573ff080d3d94fa48d8076ed0404a8d827e715c2555e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:33Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.271199 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:33Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.287960 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:33Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.307306 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:33Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.327821 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.327861 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.327870 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:33 crc 
kubenswrapper[4816]: I0316 00:08:33.327885 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.327896 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:33Z","lastTransitionTime":"2026-03-16T00:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.331850 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919
d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCou
nt\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-con
troller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/va
r/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1249f8b1c01db98230c10bde49bdb68f17caccc1ba0546e462df4384ca1658dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1249f8b1c01db98230c10bde49bdb68f17caccc1ba0546e462df4384ca1658dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-psjs7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:33Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.357299 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49e35ef6-7a6d-438f-b598-4445d73db379\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbe90b49c5d42bb9d6d5d1b8f1d02b571403dd3f39d3118081351d72c4910914\\\",\\\"image\\\":\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://508e6dbe6f2f1392d16501e09ee2ec523a3c2a245cd96c6ab773e26e83b8bb6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b5a3223ff0fb5b96e5b87ee836add66498165759f4d6ed8cb6413f091967e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://603bdcbd4a6af5fda7ca09e4823e0db8d7ec3e3eb38eddf9f062a14b433cb094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b061f16a65a27b9a5ef66231376b70fec084479a991b7fdd430957df76da32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"
resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}}},{\\\"containerI
D\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:33Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.380208 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-szscw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9789e58-12c8-4831-9401-af48a3e92209\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6f0b7e8f95bbbf3b90133349b8b13d2784582a5d95dae8dd159328b570ae962\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxf6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-szscw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:33Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.396787 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd08ece2-7636-4966-973a-e96a34b70b53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trwbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trwbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jrdcz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:33Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.414821 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c8786d1-025d-4788-bafe-c2c2eaf8e398\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46db23088e8ba791d736f70a65bd066af80a5ec27c31332d9ac9c52530b21bfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da7b257af12b9ee605dc32a26d9565f866a9cafebb4936a0bde7f42ae9909f80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c2523ea43f2490afb81354be8a2840f54a3be1a8d3154196e0c8ac365d09723a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0316 00:07:59.373922 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0316 00:07:59.374075 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0316 00:07:59.374860 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2696044728/tls.crt::/tmp/serving-cert-2696044728/tls.key\\\\\\\"\\\\nI0316 00:07:59.560928 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0316 00:07:59.563996 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0316 00:07:59.564013 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0316 00:07:59.564035 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0316 00:07:59.564041 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0316 00:07:59.571475 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0316 00:07:59.571523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571531 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571540 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0316 00:07:59.571574 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0316 00:07:59.571588 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0316 00:07:59.571493 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0316 00:07:59.571597 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0316 00:07:59.573484 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81d0c40be9c3912864280cd98481f837ef29f69d85599b39decf5e9d7a97eff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:33Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.430466 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.430490 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.430498 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.430513 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.430522 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:33Z","lastTransitionTime":"2026-03-16T00:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.431996 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://516852108b3d17a15676573ff080d3d94fa48d8076ed0404a8d827e715c2555e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:33Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.451940 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:33Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.475946 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mt7bq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"03ef49f1-0c6a-443a-8df3-2db339c562ed\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8979f751ec4797986924a3efea84d02bc1c1cf303e1e992f349c9083f2fe4f08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://8979f751ec4797986924a3efea84d02bc1c1cf303e1e992f349c9083f2fe4f08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"rea
dy\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host
/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mt7bq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:33Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.496764 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:33Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.514511 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:33Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.536307 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.536352 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.536364 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 
00:08:33.536384 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.536396 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:33Z","lastTransitionTime":"2026-03-16T00:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.543458 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49e35ef6-7a6d-438f-b598-4445d73db379\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbe90b49c5d42bb9d6d5d1b8f1d02b571403dd3f39d3118081351d72c4910914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://508e6dbe6f2f1392d16501e09ee2ec523a3c2a245cd96c6ab773e26e83b8bb6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b5a3223ff0fb5b96e5b87ee836add66498165759f4d6ed8cb6413f091967e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\
"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://603bdcbd4a6af5fda7ca09e4823e0db8d7ec3e3eb38eddf9f062a14b433cb094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b061f16a65a27b9a5ef66231376b70fec084479a991b7fdd430957df76da32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-
dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ec
d6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:33Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.560983 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-szscw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9789e58-12c8-4831-9401-af48a3e92209\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6f0b7e8f95bbbf3b90133349b8b13d2784582a5d95dae8dd159328b570ae962\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxf6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-szscw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:33Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.574312 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd08ece2-7636-4966-973a-e96a34b70b53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6dd22d6548986bf126a8ce6b30c7c11d8c37bd0c03539a55cc28fb7debcfc26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trwbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7003a4592a48
f87ab63e9f5637c7665040f9784a7c8f599c82ed61270ddb793c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trwbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jrdcz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:33Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.607081 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1249f8b1c01db98230c10bde49bdb68f17caccc1ba0546e462df4384ca1658dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1249f8b1c01db98230c10bde49bdb68f17caccc1ba0546e462df4384ca1658dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-psjs7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:33Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.631918 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://723654bc91ce4d40a27b14a9c4df1ed6c8dff2517f378927d7a6a96aeaa6951b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\"
:{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:33Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.650337 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.650398 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.650416 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.650442 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.650459 4816 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:33Z","lastTransitionTime":"2026-03-16T00:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.650885 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37869baff974556c22aadadfa4b528b5d6b9ad236107b4dbc945b3cdbf743f20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-ide
ntity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f9fe727708fdaca68596b7d305629ade45b061a88a81085f885d490ab99e24b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:33Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.668329 4816 scope.go:117] "RemoveContainer" containerID="29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f" Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.668903 4816 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-dns/node-resolver-cnhkf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e686cd4-bddf-463e-b471-e49ea862691e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f6c3cacd4d7f6866e66b7234b280540ef2c443531de7a11f6894323ef24a9fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bwzt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"pha
se\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cnhkf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:33Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:33 crc kubenswrapper[4816]: E0316 00:08:33.670292 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.753977 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.754031 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.754046 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.754075 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.754087 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:33Z","lastTransitionTime":"2026-03-16T00:08:33Z","reason":"KubeletNotReady","message":"container runtime network not 
ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.855992 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.856037 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.856049 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.856069 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.856090 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:33Z","lastTransitionTime":"2026-03-16T00:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.959113 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.959151 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.959163 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.959182 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:33 crc kubenswrapper[4816]: I0316 00:08:33.959194 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:33Z","lastTransitionTime":"2026-03-16T00:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:34 crc kubenswrapper[4816]: I0316 00:08:34.062749 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:34 crc kubenswrapper[4816]: I0316 00:08:34.062797 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:34 crc kubenswrapper[4816]: I0316 00:08:34.062810 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:34 crc kubenswrapper[4816]: I0316 00:08:34.062830 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:34 crc kubenswrapper[4816]: I0316 00:08:34.062843 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:34Z","lastTransitionTime":"2026-03-16T00:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:34 crc kubenswrapper[4816]: I0316 00:08:34.140234 4816 generic.go:334] "Generic (PLEG): container finished" podID="03ef49f1-0c6a-443a-8df3-2db339c562ed" containerID="fa87691b0dc314209d88d1f4d5b2227abc1de527e8111831bb8243ba6192c300" exitCode=0 Mar 16 00:08:34 crc kubenswrapper[4816]: I0316 00:08:34.140313 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mt7bq" event={"ID":"03ef49f1-0c6a-443a-8df3-2db339c562ed","Type":"ContainerDied","Data":"fa87691b0dc314209d88d1f4d5b2227abc1de527e8111831bb8243ba6192c300"} Mar 16 00:08:34 crc kubenswrapper[4816]: I0316 00:08:34.153314 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" event={"ID":"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88","Type":"ContainerStarted","Data":"826495d8ab8f414169bf36159278da0a040fec699a74b4aebdc8f87ccdc48bd6"} Mar 16 00:08:34 crc kubenswrapper[4816]: I0316 00:08:34.153387 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" event={"ID":"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88","Type":"ContainerStarted","Data":"aa5f052d32ad8f52f33ee5baea5af98a64f05b831b86ebced6c9422fb4845fcf"} Mar 16 00:08:34 crc kubenswrapper[4816]: I0316 00:08:34.153410 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" event={"ID":"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88","Type":"ContainerStarted","Data":"4e0a510814d60106574e3cd3e612c031ac900caa0f24a12a62d7cd1f6bfda1ed"} Mar 16 00:08:34 crc kubenswrapper[4816]: I0316 00:08:34.153428 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" event={"ID":"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88","Type":"ContainerStarted","Data":"f47b6c54d5d72d827208f4d8228be260d9d19b4a720b3174349288dee2916641"} Mar 16 00:08:34 crc kubenswrapper[4816]: I0316 00:08:34.153447 4816 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" event={"ID":"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88","Type":"ContainerStarted","Data":"86c28f1ae2f2fb61b7755ac14bc7f5346036735c75daccaa0e31b4e606cfb4c2"} Mar 16 00:08:34 crc kubenswrapper[4816]: I0316 00:08:34.153464 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" event={"ID":"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88","Type":"ContainerStarted","Data":"0fac0a986edf984b0e34eda0f969205d5bc92f23d0edacb3c6dd8721eaed2fb9"} Mar 16 00:08:34 crc kubenswrapper[4816]: I0316 00:08:34.162482 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod 
was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:34Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:34 crc kubenswrapper[4816]: I0316 00:08:34.166437 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:34 crc kubenswrapper[4816]: I0316 00:08:34.166475 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:34 crc kubenswrapper[4816]: I0316 00:08:34.166490 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:34 crc kubenswrapper[4816]: I0316 00:08:34.166513 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:34 crc kubenswrapper[4816]: I0316 00:08:34.166530 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:34Z","lastTransitionTime":"2026-03-16T00:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:34 crc kubenswrapper[4816]: I0316 00:08:34.183019 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:34Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:34 crc kubenswrapper[4816]: I0316 00:08:34.214602 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"49e35ef6-7a6d-438f-b598-4445d73db379\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbe90b49c5d42bb9d6d5d1b8f1d02b571403dd3f39d3118081351d72c4910914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://508e6dbe6f2f1392d16501e09ee2ec523a3c2a245cd96c6ab773e26e83b8bb6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b5a3223ff0fb5b96e5b87ee836add66498165759f4d6ed8cb6413f091967e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://603bdcbd4a6af5fda7ca09e4823e0db8d7ec3e3eb38eddf9f062a14b433cb094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b061f16a65a27b9a5ef66231376b70fec084479a991b7fdd430957df76da32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:34Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:34 crc kubenswrapper[4816]: I0316 00:08:34.227942 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-szscw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9789e58-12c8-4831-9401-af48a3e92209\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6f0b7e8f95bbbf3b90133349b8b13d2784582a5d95dae8dd159328b570ae962\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f1
3fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxf6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"ho
stIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-szscw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:34Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:34 crc kubenswrapper[4816]: I0316 00:08:34.240404 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd08ece2-7636-4966-973a-e96a34b70b53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6dd22d6548986bf126a8ce6b30c7c11d8c37bd0c03539a55cc28fb7debcfc26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4e
f318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trwbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7003a4592a48f87ab63e9f5637c7665040f9784a7c8f599c82ed61270ddb793c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trwbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jrdcz\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:34Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:34 crc kubenswrapper[4816]: I0316 00:08:34.261432 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1249f8b1c01db98230c10bde49bdb68f17caccc1ba0546e462df4384ca1658dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1249f8b1c01db98230c10bde49bdb68f17caccc1ba0546e462df4384ca1658dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-psjs7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:34Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:34 crc kubenswrapper[4816]: I0316 00:08:34.270101 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:34 crc kubenswrapper[4816]: I0316 00:08:34.270158 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:34 crc kubenswrapper[4816]: I0316 00:08:34.270174 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:34 crc kubenswrapper[4816]: I0316 00:08:34.270198 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:34 crc kubenswrapper[4816]: I0316 00:08:34.270213 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:34Z","lastTransitionTime":"2026-03-16T00:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:34 crc kubenswrapper[4816]: I0316 00:08:34.278028 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://723654bc91ce4d40a27b14a9c4df1ed6c8dff2517f378927d7a6a96aeaa6951b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:34Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:34 crc kubenswrapper[4816]: I0316 00:08:34.292459 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37869baff974556c22aadadfa4b528b5d6b9ad236107b4dbc945b3cdbf743f20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-id
entity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f9fe727708fdaca68596b7d305629ade45b061a88a81085f885d490ab99e24b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:34Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:34 crc kubenswrapper[4816]: I0316 00:08:34.303144 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cnhkf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e686cd4-bddf-463e-b471-e49ea862691e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f6c3cacd4d7f6866e66b7234b280540ef2c443531de7a11f6894323ef24a9fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bwzt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cnhkf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:34Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:34 crc kubenswrapper[4816]: I0316 00:08:34.316949 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c8786d1-025d-4788-bafe-c2c2eaf8e398\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46db23088e8ba791d736f70a65bd066af80a5ec27c31332d9ac9c52530b21bfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da7b257af12b9ee605dc32a26d9565f866a9cafebb4936a0bde7f42ae9909f80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c2523ea43f2490afb81354be8a2840f54a3be1a8d3154196e0c8ac365d09723a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0316 00:07:59.373922 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0316 00:07:59.374075 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0316 00:07:59.374860 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2696044728/tls.crt::/tmp/serving-cert-2696044728/tls.key\\\\\\\"\\\\nI0316 00:07:59.560928 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0316 00:07:59.563996 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0316 00:07:59.564013 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0316 00:07:59.564035 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0316 00:07:59.564041 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0316 00:07:59.571475 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0316 00:07:59.571523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571531 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571540 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0316 00:07:59.571574 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0316 00:07:59.571588 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0316 00:07:59.571493 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0316 00:07:59.571597 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0316 00:07:59.573484 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81d0c40be9c3912864280cd98481f837ef29f69d85599b39decf5e9d7a97eff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:34Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:34 crc kubenswrapper[4816]: I0316 00:08:34.331886 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://516852108b3d17a15676573ff080d3d94fa48d8076ed0404a8d827e715c2555e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:34Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:34 crc kubenswrapper[4816]: I0316 00:08:34.346295 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:34Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:34 crc kubenswrapper[4816]: I0316 00:08:34.365578 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mt7bq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"03ef49f1-0c6a-443a-8df3-2db339c562ed\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8979f751ec4797986924a3efea84d02bc1c1cf303e1e992f349c9083f2fe4f08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8979f751ec4797986924a3efea84d02bc1c1cf303e1e992f349c9083f2fe4f08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa87691b0dc314209d88d1f4d5b2227abc1de527e8111831bb8243ba6192c300\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa87691b0dc314209d88d1f4d5b2227abc1de527e8111831bb8243ba6192c300\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mt7bq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:34Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:34 crc kubenswrapper[4816]: I0316 00:08:34.372781 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:34 crc kubenswrapper[4816]: I0316 00:08:34.372808 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:34 crc kubenswrapper[4816]: I0316 00:08:34.372816 4816 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Mar 16 00:08:34 crc kubenswrapper[4816]: I0316 00:08:34.372829 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:34 crc kubenswrapper[4816]: I0316 00:08:34.372839 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:34Z","lastTransitionTime":"2026-03-16T00:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:34 crc kubenswrapper[4816]: I0316 00:08:34.475418 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:34 crc kubenswrapper[4816]: I0316 00:08:34.475460 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:34 crc kubenswrapper[4816]: I0316 00:08:34.475470 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:34 crc kubenswrapper[4816]: I0316 00:08:34.475487 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:34 crc kubenswrapper[4816]: I0316 00:08:34.475499 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:34Z","lastTransitionTime":"2026-03-16T00:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:34 crc kubenswrapper[4816]: I0316 00:08:34.578509 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:34 crc kubenswrapper[4816]: I0316 00:08:34.578601 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:34 crc kubenswrapper[4816]: I0316 00:08:34.578623 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:34 crc kubenswrapper[4816]: I0316 00:08:34.578647 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:34 crc kubenswrapper[4816]: I0316 00:08:34.578664 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:34Z","lastTransitionTime":"2026-03-16T00:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:34 crc kubenswrapper[4816]: I0316 00:08:34.667367 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:08:34 crc kubenswrapper[4816]: I0316 00:08:34.667417 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:08:34 crc kubenswrapper[4816]: I0316 00:08:34.667491 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:08:34 crc kubenswrapper[4816]: E0316 00:08:34.667683 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:08:34 crc kubenswrapper[4816]: E0316 00:08:34.667869 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:08:34 crc kubenswrapper[4816]: E0316 00:08:34.668004 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:08:34 crc kubenswrapper[4816]: I0316 00:08:34.681332 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:34 crc kubenswrapper[4816]: I0316 00:08:34.681385 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:34 crc kubenswrapper[4816]: I0316 00:08:34.681403 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:34 crc kubenswrapper[4816]: I0316 00:08:34.681426 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:34 crc kubenswrapper[4816]: I0316 00:08:34.681443 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:34Z","lastTransitionTime":"2026-03-16T00:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:34 crc kubenswrapper[4816]: I0316 00:08:34.787666 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:34 crc kubenswrapper[4816]: I0316 00:08:34.787771 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:34 crc kubenswrapper[4816]: I0316 00:08:34.787829 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:34 crc kubenswrapper[4816]: I0316 00:08:34.787878 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:34 crc kubenswrapper[4816]: I0316 00:08:34.787907 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:34Z","lastTransitionTime":"2026-03-16T00:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:34 crc kubenswrapper[4816]: I0316 00:08:34.891365 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:34 crc kubenswrapper[4816]: I0316 00:08:34.891649 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:34 crc kubenswrapper[4816]: I0316 00:08:34.891676 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:34 crc kubenswrapper[4816]: I0316 00:08:34.891701 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:34 crc kubenswrapper[4816]: I0316 00:08:34.891719 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:34Z","lastTransitionTime":"2026-03-16T00:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:34 crc kubenswrapper[4816]: I0316 00:08:34.994353 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:34 crc kubenswrapper[4816]: I0316 00:08:34.994438 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:34 crc kubenswrapper[4816]: I0316 00:08:34.994453 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:34 crc kubenswrapper[4816]: I0316 00:08:34.994472 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:34 crc kubenswrapper[4816]: I0316 00:08:34.994487 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:34Z","lastTransitionTime":"2026-03-16T00:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:35 crc kubenswrapper[4816]: I0316 00:08:35.097006 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:35 crc kubenswrapper[4816]: I0316 00:08:35.097071 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:35 crc kubenswrapper[4816]: I0316 00:08:35.097088 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:35 crc kubenswrapper[4816]: I0316 00:08:35.097115 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:35 crc kubenswrapper[4816]: I0316 00:08:35.097133 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:35Z","lastTransitionTime":"2026-03-16T00:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:35 crc kubenswrapper[4816]: I0316 00:08:35.160188 4816 generic.go:334] "Generic (PLEG): container finished" podID="03ef49f1-0c6a-443a-8df3-2db339c562ed" containerID="33f82b993e3d729a5830e88136e74687c930d6ad1a15c64f003f311fda418bce" exitCode=0 Mar 16 00:08:35 crc kubenswrapper[4816]: I0316 00:08:35.160259 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mt7bq" event={"ID":"03ef49f1-0c6a-443a-8df3-2db339c562ed","Type":"ContainerDied","Data":"33f82b993e3d729a5830e88136e74687c930d6ad1a15c64f003f311fda418bce"} Mar 16 00:08:35 crc kubenswrapper[4816]: I0316 00:08:35.203206 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:35 crc kubenswrapper[4816]: I0316 00:08:35.203271 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:35 crc kubenswrapper[4816]: I0316 00:08:35.203289 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:35 crc kubenswrapper[4816]: I0316 00:08:35.203316 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:35 crc kubenswrapper[4816]: I0316 00:08:35.203333 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:35Z","lastTransitionTime":"2026-03-16T00:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:35 crc kubenswrapper[4816]: I0316 00:08:35.205148 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49e35ef6-7a6d-438f-b598-4445d73db379\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbe90b49c5d42bb9d6d5d1b8f1d02b571403dd3f39d3118081351d72c4910914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://508e6dbe6f2f1392d16501e09ee2ec523a3c2a245cd96c6ab773e26e83b8bb6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b5a3223ff0fb5b96e5b87ee836add66498165759f4d6ed8cb6413f091967e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://603bdcbd4a6af5fda7ca09e4823e0db8d7ec3e3eb38eddf9f062a14b433cb094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b061f16a65a27b9a5ef66231376b70fec084479a991b7fdd430957df76da32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:35Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:35 crc kubenswrapper[4816]: I0316 00:08:35.226671 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-szscw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9789e58-12c8-4831-9401-af48a3e92209\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6f0b7e8f95bbbf3b90133349b8b13d2784582a5d95dae8dd159328b570ae962\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxf6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-szscw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:35Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:35 crc kubenswrapper[4816]: I0316 00:08:35.245481 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd08ece2-7636-4966-973a-e96a34b70b53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6dd22d6548986bf126a8ce6b30c7c11d8c37bd0c03539a55cc28fb7debcfc26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trwbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7003a4592a48
f87ab63e9f5637c7665040f9784a7c8f599c82ed61270ddb793c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trwbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jrdcz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:35Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:35 crc kubenswrapper[4816]: I0316 00:08:35.275588 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1249f8b1c01db98230c10bde49bdb68f17caccc1ba0546e462df4384ca1658dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1249f8b1c01db98230c10bde49bdb68f17caccc1ba0546e462df4384ca1658dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-psjs7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:35Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:35 crc kubenswrapper[4816]: I0316 00:08:35.292686 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://723654bc91ce4d40a27b14a9c4df1ed6c8dff2517f378927d7a6a96aeaa6951b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\"
:{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:35Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:35 crc kubenswrapper[4816]: I0316 00:08:35.307380 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:35 crc kubenswrapper[4816]: I0316 00:08:35.307433 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:35 crc kubenswrapper[4816]: I0316 00:08:35.307448 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:35 crc kubenswrapper[4816]: I0316 00:08:35.307469 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:35 crc kubenswrapper[4816]: I0316 00:08:35.307483 4816 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:35Z","lastTransitionTime":"2026-03-16T00:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:35 crc kubenswrapper[4816]: I0316 00:08:35.312504 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37869baff974556c22aadadfa4b528b5d6b9ad236107b4dbc945b3cdbf743f20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-ide
ntity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f9fe727708fdaca68596b7d305629ade45b061a88a81085f885d490ab99e24b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:35Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:35 crc kubenswrapper[4816]: I0316 00:08:35.326875 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cnhkf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e686cd4-bddf-463e-b471-e49ea862691e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f6c3cacd4d7f6866e66b7234b280540ef2c443531de7a11f6894323ef24a9fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bwzt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cnhkf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:35Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:35 crc kubenswrapper[4816]: I0316 00:08:35.345167 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c8786d1-025d-4788-bafe-c2c2eaf8e398\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46db23088e8ba791d736f70a65bd066af80a5ec27c31332d9ac9c52530b21bfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da7b257af12b9ee605dc32a26d9565f866a9cafebb4936a0bde7f42ae9909f80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c2523ea43f2490afb81354be8a2840f54a3be1a8d3154196e0c8ac365d09723a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0316 00:07:59.373922 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0316 00:07:59.374075 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0316 00:07:59.374860 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2696044728/tls.crt::/tmp/serving-cert-2696044728/tls.key\\\\\\\"\\\\nI0316 00:07:59.560928 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0316 00:07:59.563996 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0316 00:07:59.564013 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0316 00:07:59.564035 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0316 00:07:59.564041 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0316 00:07:59.571475 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0316 00:07:59.571523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571531 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571540 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0316 00:07:59.571574 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0316 00:07:59.571588 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0316 00:07:59.571493 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0316 00:07:59.571597 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0316 00:07:59.573484 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81d0c40be9c3912864280cd98481f837ef29f69d85599b39decf5e9d7a97eff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:35Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:35 crc kubenswrapper[4816]: I0316 00:08:35.358323 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://516852108b3d17a15676573ff080d3d94fa48d8076ed0404a8d827e715c2555e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:35Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:35 crc kubenswrapper[4816]: I0316 00:08:35.371191 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:35Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:35 crc kubenswrapper[4816]: I0316 00:08:35.387407 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mt7bq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"03ef49f1-0c6a-443a-8df3-2db339c562ed\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8979f751ec4797986924a3efea84d02bc1c1cf303e1e992f349c9083f2fe4f08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8979f751ec4797986924a3efea84d02bc1c1cf303e1e992f349c9083f2fe4f08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa87691b0dc314209d88d1f4d5b2227abc1de527e8111831bb8243ba6192c300\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa87691b0dc314209d88d1f4d5b2227abc1de527e8111831bb8243ba6192c300\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33f82b993e3d729a5830e88136e74687c930d6ad1a15c64f003f311fda418bce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33f82b993e3d729a5830e88136e74687c930d6ad1a15c64f003f311fda418bce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mt7bq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:35Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:35 crc kubenswrapper[4816]: I0316 
00:08:35.402571 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:35Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:35 crc kubenswrapper[4816]: I0316 00:08:35.409672 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:35 crc kubenswrapper[4816]: I0316 00:08:35.409704 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:35 crc kubenswrapper[4816]: I0316 00:08:35.409713 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:35 crc kubenswrapper[4816]: I0316 00:08:35.409729 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:35 crc kubenswrapper[4816]: I0316 00:08:35.409738 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:35Z","lastTransitionTime":"2026-03-16T00:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:35 crc kubenswrapper[4816]: I0316 00:08:35.417039 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:35Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:35 crc kubenswrapper[4816]: I0316 00:08:35.513632 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:35 crc kubenswrapper[4816]: I0316 00:08:35.513714 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:35 crc kubenswrapper[4816]: I0316 00:08:35.513756 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:35 crc kubenswrapper[4816]: I0316 00:08:35.513777 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:35 crc kubenswrapper[4816]: I0316 00:08:35.513831 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:35Z","lastTransitionTime":"2026-03-16T00:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:35 crc kubenswrapper[4816]: I0316 00:08:35.616957 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:35 crc kubenswrapper[4816]: I0316 00:08:35.617005 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:35 crc kubenswrapper[4816]: I0316 00:08:35.617017 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:35 crc kubenswrapper[4816]: I0316 00:08:35.617036 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:35 crc kubenswrapper[4816]: I0316 00:08:35.617048 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:35Z","lastTransitionTime":"2026-03-16T00:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:35 crc kubenswrapper[4816]: I0316 00:08:35.721599 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:35 crc kubenswrapper[4816]: I0316 00:08:35.721668 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:35 crc kubenswrapper[4816]: I0316 00:08:35.721685 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:35 crc kubenswrapper[4816]: I0316 00:08:35.721712 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:35 crc kubenswrapper[4816]: I0316 00:08:35.721729 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:35Z","lastTransitionTime":"2026-03-16T00:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:35 crc kubenswrapper[4816]: I0316 00:08:35.824919 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:35 crc kubenswrapper[4816]: I0316 00:08:35.824988 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:35 crc kubenswrapper[4816]: I0316 00:08:35.825018 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:35 crc kubenswrapper[4816]: I0316 00:08:35.825047 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:35 crc kubenswrapper[4816]: I0316 00:08:35.825073 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:35Z","lastTransitionTime":"2026-03-16T00:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:35 crc kubenswrapper[4816]: I0316 00:08:35.929257 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:35 crc kubenswrapper[4816]: I0316 00:08:35.929320 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:35 crc kubenswrapper[4816]: I0316 00:08:35.929340 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:35 crc kubenswrapper[4816]: I0316 00:08:35.929364 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:35 crc kubenswrapper[4816]: I0316 00:08:35.929381 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:35Z","lastTransitionTime":"2026-03-16T00:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:36 crc kubenswrapper[4816]: I0316 00:08:36.032616 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:36 crc kubenswrapper[4816]: I0316 00:08:36.032686 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:36 crc kubenswrapper[4816]: I0316 00:08:36.032706 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:36 crc kubenswrapper[4816]: I0316 00:08:36.032737 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:36 crc kubenswrapper[4816]: I0316 00:08:36.032764 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:36Z","lastTransitionTime":"2026-03-16T00:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:36 crc kubenswrapper[4816]: I0316 00:08:36.135747 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:36 crc kubenswrapper[4816]: I0316 00:08:36.135781 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:36 crc kubenswrapper[4816]: I0316 00:08:36.135793 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:36 crc kubenswrapper[4816]: I0316 00:08:36.135809 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:36 crc kubenswrapper[4816]: I0316 00:08:36.135820 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:36Z","lastTransitionTime":"2026-03-16T00:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:36 crc kubenswrapper[4816]: I0316 00:08:36.166681 4816 generic.go:334] "Generic (PLEG): container finished" podID="03ef49f1-0c6a-443a-8df3-2db339c562ed" containerID="b640c26797535db11051d50aefd13b7b2759cfa19d163f53764b0fe8710cefdc" exitCode=0 Mar 16 00:08:36 crc kubenswrapper[4816]: I0316 00:08:36.166732 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mt7bq" event={"ID":"03ef49f1-0c6a-443a-8df3-2db339c562ed","Type":"ContainerDied","Data":"b640c26797535db11051d50aefd13b7b2759cfa19d163f53764b0fe8710cefdc"} Mar 16 00:08:36 crc kubenswrapper[4816]: I0316 00:08:36.186623 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:36 crc kubenswrapper[4816]: I0316 00:08:36.201586 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:36 crc kubenswrapper[4816]: I0316 00:08:36.230656 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"49e35ef6-7a6d-438f-b598-4445d73db379\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbe90b49c5d42bb9d6d5d1b8f1d02b571403dd3f39d3118081351d72c4910914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://508e6dbe6f2f1392d16501e09ee2ec523a3c2a245cd96c6ab773e26e83b8bb6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b5a3223ff0fb5b96e5b87ee836add66498165759f4d6ed8cb6413f091967e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://603bdcbd4a6af5fda7ca09e4823e0db8d7ec3e3eb38eddf9f062a14b433cb094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b061f16a65a27b9a5ef66231376b70fec084479a991b7fdd430957df76da32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:36 crc kubenswrapper[4816]: I0316 00:08:36.246351 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:36 crc kubenswrapper[4816]: I0316 00:08:36.246406 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:36 crc kubenswrapper[4816]: I0316 00:08:36.246418 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:36 crc kubenswrapper[4816]: I0316 00:08:36.246443 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:36 crc kubenswrapper[4816]: I0316 00:08:36.246463 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:36Z","lastTransitionTime":"2026-03-16T00:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:36 crc kubenswrapper[4816]: I0316 00:08:36.252403 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-szscw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9789e58-12c8-4831-9401-af48a3e92209\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6f0b7e8f95bbbf3b90133349b8b13d2784582a5d95dae8dd159328b570ae962\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxf6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-szscw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:36Z 
is after 2025-08-24T17:21:41Z" Mar 16 00:08:36 crc kubenswrapper[4816]: I0316 00:08:36.270010 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd08ece2-7636-4966-973a-e96a34b70b53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6dd22d6548986bf126a8ce6b30c7c11d8c37bd0c03539a55cc28fb7debcfc26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"
mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trwbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7003a4592a48f87ab63e9f5637c7665040f9784a7c8f599c82ed61270ddb793c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trwbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jrdcz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:36 crc kubenswrapper[4816]: I0316 00:08:36.291629 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1249f8b1c01db98230c10bde49bdb68f17caccc1ba0546e462df4384ca1658dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1249f8b1c01db98230c10bde49bdb68f17caccc1ba0546e462df4384ca1658dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-psjs7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:36 crc kubenswrapper[4816]: I0316 00:08:36.306286 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://723654bc91ce4d40a27b14a9c4df1ed6c8dff2517f378927d7a6a96aeaa6951b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\"
:{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:36 crc kubenswrapper[4816]: I0316 00:08:36.326359 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37869baff974556c22aadadfa4b528b5d6b9ad236107b4dbc945b3cdbf743f20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f9fe727708fdaca68596b7d305629ade45b061a88a81085f885d490ab99e24b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:36 crc kubenswrapper[4816]: I0316 00:08:36.349643 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:36 crc kubenswrapper[4816]: I0316 00:08:36.349710 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:36 crc kubenswrapper[4816]: I0316 00:08:36.349725 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:36 crc kubenswrapper[4816]: I0316 00:08:36.349749 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:36 crc kubenswrapper[4816]: I0316 00:08:36.349765 4816 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:36Z","lastTransitionTime":"2026-03-16T00:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:36 crc kubenswrapper[4816]: I0316 00:08:36.361151 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cnhkf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e686cd4-bddf-463e-b471-e49ea862691e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f6c3cacd4d7f6866e66b7234b280540ef2c443531de7a11f6894323ef24a9fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bwzt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cnhkf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:36 crc kubenswrapper[4816]: I0316 00:08:36.391845 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c8786d1-025d-4788-bafe-c2c2eaf8e398\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46db23088e8ba791d736f70a65bd066af80a5ec27c31332d9ac9c52530b21bfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da7b257af12b9ee605dc32a26d9565f866a9cafebb4936a0bde7f42ae9909f80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2523ea43f2490afb81354be8a2840f54a3be1a8d3154196e0c8ac365d09723a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0316 00:07:59.373922 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0316 00:07:59.374075 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0316 00:07:59.374860 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2696044728/tls.crt::/tmp/serving-cert-2696044728/tls.key\\\\\\\"\\\\nI0316 00:07:59.560928 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0316 00:07:59.563996 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0316 00:07:59.564013 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0316 00:07:59.564035 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0316 00:07:59.564041 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0316 00:07:59.571475 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0316 00:07:59.571523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571531 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571540 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0316 00:07:59.571574 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0316 00:07:59.571588 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0316 00:07:59.571493 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0316 00:07:59.571597 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0316 00:07:59.573484 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81d0c40be9c3912864280cd98481f837ef29f69d85599b39decf5e9d7a97eff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e3
9d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:36 crc kubenswrapper[4816]: I0316 00:08:36.408640 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://516852108b3d17a15676573ff080d3d94fa48d8076ed0404a8d827e715c2555e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-16T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:36 crc kubenswrapper[4816]: I0316 00:08:36.421286 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:36 crc kubenswrapper[4816]: I0316 00:08:36.437930 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mt7bq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"03ef49f1-0c6a-443a-8df3-2db339c562ed\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8979f751ec4797986924a3efea84d02bc1c1cf303e1e992f349c9083f2fe4f08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://8979f751ec4797986924a3efea84d02bc1c1cf303e1e992f349c9083f2fe4f08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa87691b0dc314209d88d1f4d5b2227abc1de527e8111831bb8243ba6192c300\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa87691b0dc314209d88d1f4d5b2227abc1de527e8111831bb8243ba6192c300\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33f82b993e3d729a5830e88136e74687c930d6ad1a15c64f003f311fda418bce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33f82b993e3d729a5830e88136e74687c930d6ad1a15c64f003f311fda418bce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b640c26797535db11051d50aefd13b7b2759cfa19d163f53764b0fe8710cefdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b640c26797535db11051d50aefd13b7b2759cfa19d163f53764b0fe8710cefdc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\
\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mt7bq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:36 crc kubenswrapper[4816]: I0316 00:08:36.451750 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:36 crc kubenswrapper[4816]: I0316 00:08:36.451779 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:36 crc kubenswrapper[4816]: I0316 00:08:36.451789 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:36 crc kubenswrapper[4816]: I0316 00:08:36.451806 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:36 crc kubenswrapper[4816]: I0316 00:08:36.451823 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:36Z","lastTransitionTime":"2026-03-16T00:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin 
returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:36 crc kubenswrapper[4816]: I0316 00:08:36.555240 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:36 crc kubenswrapper[4816]: I0316 00:08:36.555297 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:36 crc kubenswrapper[4816]: I0316 00:08:36.555312 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:36 crc kubenswrapper[4816]: I0316 00:08:36.555330 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:36 crc kubenswrapper[4816]: I0316 00:08:36.555348 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:36Z","lastTransitionTime":"2026-03-16T00:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:36 crc kubenswrapper[4816]: I0316 00:08:36.658271 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:36 crc kubenswrapper[4816]: I0316 00:08:36.658316 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:36 crc kubenswrapper[4816]: I0316 00:08:36.658328 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:36 crc kubenswrapper[4816]: I0316 00:08:36.658345 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:36 crc kubenswrapper[4816]: I0316 00:08:36.658355 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:36Z","lastTransitionTime":"2026-03-16T00:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:36 crc kubenswrapper[4816]: I0316 00:08:36.666858 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:08:36 crc kubenswrapper[4816]: I0316 00:08:36.666883 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:08:36 crc kubenswrapper[4816]: E0316 00:08:36.666985 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:08:36 crc kubenswrapper[4816]: I0316 00:08:36.667052 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:08:36 crc kubenswrapper[4816]: E0316 00:08:36.667325 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:08:36 crc kubenswrapper[4816]: E0316 00:08:36.667399 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:08:36 crc kubenswrapper[4816]: I0316 00:08:36.760966 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:36 crc kubenswrapper[4816]: I0316 00:08:36.761028 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:36 crc kubenswrapper[4816]: I0316 00:08:36.761044 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:36 crc kubenswrapper[4816]: I0316 00:08:36.761070 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:36 crc kubenswrapper[4816]: I0316 00:08:36.761087 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:36Z","lastTransitionTime":"2026-03-16T00:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:36 crc kubenswrapper[4816]: I0316 00:08:36.864860 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:36 crc kubenswrapper[4816]: I0316 00:08:36.864903 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:36 crc kubenswrapper[4816]: I0316 00:08:36.864923 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:36 crc kubenswrapper[4816]: I0316 00:08:36.864948 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:36 crc kubenswrapper[4816]: I0316 00:08:36.864964 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:36Z","lastTransitionTime":"2026-03-16T00:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:36 crc kubenswrapper[4816]: I0316 00:08:36.967538 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:36 crc kubenswrapper[4816]: I0316 00:08:36.967612 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:36 crc kubenswrapper[4816]: I0316 00:08:36.967626 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:36 crc kubenswrapper[4816]: I0316 00:08:36.967650 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:36 crc kubenswrapper[4816]: I0316 00:08:36.967663 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:36Z","lastTransitionTime":"2026-03-16T00:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:37 crc kubenswrapper[4816]: I0316 00:08:37.071310 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:37 crc kubenswrapper[4816]: I0316 00:08:37.071388 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:37 crc kubenswrapper[4816]: I0316 00:08:37.071403 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:37 crc kubenswrapper[4816]: I0316 00:08:37.071423 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:37 crc kubenswrapper[4816]: I0316 00:08:37.071439 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:37Z","lastTransitionTime":"2026-03-16T00:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:37 crc kubenswrapper[4816]: I0316 00:08:37.173676 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:37 crc kubenswrapper[4816]: I0316 00:08:37.173746 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:37 crc kubenswrapper[4816]: I0316 00:08:37.173773 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:37 crc kubenswrapper[4816]: I0316 00:08:37.173803 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:37 crc kubenswrapper[4816]: I0316 00:08:37.173828 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:37Z","lastTransitionTime":"2026-03-16T00:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:37 crc kubenswrapper[4816]: I0316 00:08:37.177098 4816 generic.go:334] "Generic (PLEG): container finished" podID="03ef49f1-0c6a-443a-8df3-2db339c562ed" containerID="fe5359762a36b39c9f8ae939a1b9d094c42ffab59c4b9743a83b4391c171a512" exitCode=0 Mar 16 00:08:37 crc kubenswrapper[4816]: I0316 00:08:37.177227 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mt7bq" event={"ID":"03ef49f1-0c6a-443a-8df3-2db339c562ed","Type":"ContainerDied","Data":"fe5359762a36b39c9f8ae939a1b9d094c42ffab59c4b9743a83b4391c171a512"} Mar 16 00:08:37 crc kubenswrapper[4816]: I0316 00:08:37.184659 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" event={"ID":"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88","Type":"ContainerStarted","Data":"4d6bf581280690c99ddbbecd4cf4e215a12230d377978aa48acd3c05b85a3c56"} Mar 16 00:08:37 crc kubenswrapper[4816]: I0316 00:08:37.217084 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c8786d1-025d-4788-bafe-c2c2eaf8e398\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46db23088e8ba791d736f70a65bd066af80a5ec27c31332d9ac9c52530b21bfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da7b257af12b9ee605dc32a26d9565f866a9cafebb4936a0bde7f42ae9909f80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c2523ea43f2490afb81354be8a2840f54a3be1a8d3154196e0c8ac365d09723a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0316 00:07:59.373922 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0316 00:07:59.374075 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0316 00:07:59.374860 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2696044728/tls.crt::/tmp/serving-cert-2696044728/tls.key\\\\\\\"\\\\nI0316 00:07:59.560928 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0316 00:07:59.563996 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0316 00:07:59.564013 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0316 00:07:59.564035 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0316 00:07:59.564041 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0316 00:07:59.571475 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0316 00:07:59.571523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571531 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571540 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0316 00:07:59.571574 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0316 00:07:59.571588 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0316 00:07:59.571493 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0316 00:07:59.571597 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0316 00:07:59.573484 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81d0c40be9c3912864280cd98481f837ef29f69d85599b39decf5e9d7a97eff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:37Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:37 crc kubenswrapper[4816]: I0316 00:08:37.236384 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://516852108b3d17a15676573ff080d3d94fa48d8076ed0404a8d827e715c2555e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:37Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:37 crc kubenswrapper[4816]: I0316 00:08:37.254419 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:37Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:37 crc kubenswrapper[4816]: I0316 00:08:37.272691 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mt7bq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"03ef49f1-0c6a-443a-8df3-2db339c562ed\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8979f751ec4797986924a3efea84d02bc1c1cf303e1e992f349c9083f2fe4f08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8979f751ec4797986924a3efea84d02bc1c1cf303e1e992f349c9083f2fe4f08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa87691b0dc314209d88d1f4d5b2227abc1de527e8111831bb8243ba6192c300\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa87691b0dc314209d88d1f4d5b2227abc1de527e8111831bb8243ba6192c300\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33f82b993e3d729a5830e88136e74687c930d6ad1a15c64f003f311fda418bce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33f82b993e3d729a5830e88136e74687c930d6ad1a15c64f003f311fda418bce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b640c26797535db11051d50aefd13b7b2759cfa19d163f53764b0fe8710cefdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b640c26797535db11051d50aefd13b7b2759cfa19d163f53764b0fe8710cefdc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe5359762a36b39c9f8ae939a1b9d094c42ffab59c4b9743a83b4391c171a512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe5359762a36b39c9f8ae939a1b9d094c42ffab59c4b9743a83b4391c171a512\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mt7bq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:37Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:37 crc kubenswrapper[4816]: I0316 00:08:37.276416 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:37 crc kubenswrapper[4816]: I0316 00:08:37.276446 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:37 crc kubenswrapper[4816]: I0316 00:08:37.276458 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:37 crc kubenswrapper[4816]: I0316 00:08:37.276475 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:37 crc kubenswrapper[4816]: I0316 00:08:37.276487 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:37Z","lastTransitionTime":"2026-03-16T00:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:37 crc kubenswrapper[4816]: I0316 00:08:37.287738 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:37Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:37 crc kubenswrapper[4816]: I0316 00:08:37.306924 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:37Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:37 crc kubenswrapper[4816]: I0316 00:08:37.329857 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"49e35ef6-7a6d-438f-b598-4445d73db379\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbe90b49c5d42bb9d6d5d1b8f1d02b571403dd3f39d3118081351d72c4910914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://508e6dbe6f2f1392d16501e09ee2ec523a3c2a245cd96c6ab773e26e83b8bb6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b5a3223ff0fb5b96e5b87ee836add66498165759f4d6ed8cb6413f091967e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://603bdcbd4a6af5fda7ca09e4823e0db8d7ec3e3eb38eddf9f062a14b433cb094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b061f16a65a27b9a5ef66231376b70fec084479a991b7fdd430957df76da32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:37Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:37 crc kubenswrapper[4816]: I0316 00:08:37.343488 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-szscw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9789e58-12c8-4831-9401-af48a3e92209\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6f0b7e8f95bbbf3b90133349b8b13d2784582a5d95dae8dd159328b570ae962\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f1
3fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxf6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"ho
stIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-szscw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:37Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:37 crc kubenswrapper[4816]: I0316 00:08:37.356515 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd08ece2-7636-4966-973a-e96a34b70b53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6dd22d6548986bf126a8ce6b30c7c11d8c37bd0c03539a55cc28fb7debcfc26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4e
f318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trwbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7003a4592a48f87ab63e9f5637c7665040f9784a7c8f599c82ed61270ddb793c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trwbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jrdcz\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:37Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:37 crc kubenswrapper[4816]: I0316 00:08:37.374191 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1249f8b1c01db98230c10bde49bdb68f17caccc1ba0546e462df4384ca1658dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1249f8b1c01db98230c10bde49bdb68f17caccc1ba0546e462df4384ca1658dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-psjs7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:37Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:37 crc kubenswrapper[4816]: I0316 00:08:37.379062 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:37 crc kubenswrapper[4816]: I0316 00:08:37.379111 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:37 crc kubenswrapper[4816]: I0316 00:08:37.379125 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:37 crc kubenswrapper[4816]: I0316 00:08:37.379143 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:37 crc kubenswrapper[4816]: I0316 00:08:37.379156 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:37Z","lastTransitionTime":"2026-03-16T00:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:37 crc kubenswrapper[4816]: I0316 00:08:37.386520 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://723654bc91ce4d40a27b14a9c4df1ed6c8dff2517f378927d7a6a96aeaa6951b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:37Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:37 crc kubenswrapper[4816]: I0316 00:08:37.397479 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37869baff974556c22aadadfa4b528b5d6b9ad236107b4dbc945b3cdbf743f20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-id
entity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f9fe727708fdaca68596b7d305629ade45b061a88a81085f885d490ab99e24b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:37Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:37 crc kubenswrapper[4816]: I0316 00:08:37.407028 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cnhkf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e686cd4-bddf-463e-b471-e49ea862691e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f6c3cacd4d7f6866e66b7234b280540ef2c443531de7a11f6894323ef24a9fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bwzt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cnhkf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:37Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:37 crc kubenswrapper[4816]: I0316 00:08:37.481499 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:37 crc kubenswrapper[4816]: I0316 00:08:37.481541 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:37 crc kubenswrapper[4816]: I0316 00:08:37.481568 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:37 crc kubenswrapper[4816]: I0316 00:08:37.481584 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:37 crc kubenswrapper[4816]: I0316 00:08:37.481594 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:37Z","lastTransitionTime":"2026-03-16T00:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:37 crc kubenswrapper[4816]: I0316 00:08:37.584811 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:37 crc kubenswrapper[4816]: I0316 00:08:37.584890 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:37 crc kubenswrapper[4816]: I0316 00:08:37.584902 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:37 crc kubenswrapper[4816]: I0316 00:08:37.584917 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:37 crc kubenswrapper[4816]: I0316 00:08:37.584928 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:37Z","lastTransitionTime":"2026-03-16T00:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:37 crc kubenswrapper[4816]: I0316 00:08:37.682803 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c8786d1-025d-4788-bafe-c2c2eaf8e398\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46db23088e8ba791d736f70a65bd066af80a5ec27c31332d9ac9c52530b21bfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da7b257af12b9ee605dc32a26d9565f866a9cafebb4936a0bde7f42ae9909f80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c2523ea43f2490afb81354be8a2840f54a3be1a8d3154196e0c8ac365d09723a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0316 00:07:59.373922 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0316 00:07:59.374075 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0316 00:07:59.374860 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2696044728/tls.crt::/tmp/serving-cert-2696044728/tls.key\\\\\\\"\\\\nI0316 00:07:59.560928 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0316 00:07:59.563996 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0316 00:07:59.564013 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0316 00:07:59.564035 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0316 00:07:59.564041 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0316 00:07:59.571475 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0316 00:07:59.571523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571531 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571540 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0316 00:07:59.571574 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0316 00:07:59.571588 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0316 00:07:59.571493 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0316 00:07:59.571597 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0316 00:07:59.573484 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81d0c40be9c3912864280cd98481f837ef29f69d85599b39decf5e9d7a97eff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:37Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:37 crc kubenswrapper[4816]: I0316 00:08:37.687436 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:37 crc kubenswrapper[4816]: I0316 00:08:37.687501 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:37 crc kubenswrapper[4816]: I0316 00:08:37.687526 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:37 crc kubenswrapper[4816]: I0316 00:08:37.687585 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:37 crc kubenswrapper[4816]: I0316 00:08:37.687610 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:37Z","lastTransitionTime":"2026-03-16T00:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:37 crc kubenswrapper[4816]: I0316 00:08:37.700159 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://516852108b3d17a15676573ff080d3d94fa48d8076ed0404a8d827e715c2555e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:37Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:37 crc kubenswrapper[4816]: I0316 00:08:37.716270 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:37Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:37 crc kubenswrapper[4816]: I0316 00:08:37.734947 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mt7bq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"03ef49f1-0c6a-443a-8df3-2db339c562ed\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8979f751ec4797986924a3efea84d02bc1c1cf303e1e992f349c9083f2fe4f08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containe
rID\\\":\\\"cri-o://8979f751ec4797986924a3efea84d02bc1c1cf303e1e992f349c9083f2fe4f08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa87691b0dc314209d88d1f4d5b2227abc1de527e8111831bb8243ba6192c300\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa87691b0dc314209d88d1f4d5b2227abc1de527e8111831bb8243ba6192c300\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-
allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33f82b993e3d729a5830e88136e74687c930d6ad1a15c64f003f311fda418bce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33f82b993e3d729a5830e88136e74687c930d6ad1a15c64f003f311fda418bce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b640c26797535db11051d50aefd13b7b2759cfa19d163f53764b0fe8710cefdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":f
alse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b640c26797535db11051d50aefd13b7b2759cfa19d163f53764b0fe8710cefdc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe5359762a36b39c9f8ae939a1b9d094c42ffab59c4b9743a83b4391c171a512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe5359762a36b39c9f8ae939a1b9d094c42ffab59c4b9743a83b4391c171a512\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mt7bq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:37Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:37 crc kubenswrapper[4816]: I0316 00:08:37.750510 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:37Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:37 crc kubenswrapper[4816]: I0316 00:08:37.763730 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:37Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:37 crc kubenswrapper[4816]: I0316 00:08:37.789807 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:37 crc kubenswrapper[4816]: I0316 00:08:37.789851 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:37 crc kubenswrapper[4816]: I0316 00:08:37.789862 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:37 crc kubenswrapper[4816]: I0316 
00:08:37.789883 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:37 crc kubenswrapper[4816]: I0316 00:08:37.789897 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:37Z","lastTransitionTime":"2026-03-16T00:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:37 crc kubenswrapper[4816]: I0316 00:08:37.798658 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49e35ef6-7a6d-438f-b598-4445d73db379\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbe90b49c5d42bb9d6d5d1b8f1d02b571403dd3f39d3118081351d72c4910914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://508e6dbe6f2f1392d16501e09ee2ec523a3c2a245cd96c6ab773e26e83b8bb6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b5a3223ff0fb5b96e5b87ee836add66498165759f4d6ed8cb6413f091967e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\
"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://603bdcbd4a6af5fda7ca09e4823e0db8d7ec3e3eb38eddf9f062a14b433cb094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b061f16a65a27b9a5ef66231376b70fec084479a991b7fdd430957df76da32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-
dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ec
d6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:37Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:37 crc kubenswrapper[4816]: I0316 00:08:37.813815 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-szscw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9789e58-12c8-4831-9401-af48a3e92209\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6f0b7e8f95bbbf3b90133349b8b13d2784582a5d95dae8dd159328b570ae962\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxf6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-szscw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:37Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:37 crc kubenswrapper[4816]: I0316 00:08:37.831421 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd08ece2-7636-4966-973a-e96a34b70b53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6dd22d6548986bf126a8ce6b30c7c11d8c37bd0c03539a55cc28fb7debcfc26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trwbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7003a4592a48
f87ab63e9f5637c7665040f9784a7c8f599c82ed61270ddb793c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trwbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jrdcz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:37Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:37 crc kubenswrapper[4816]: I0316 00:08:37.859100 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1249f8b1c01db98230c10bde49bdb68f17caccc1ba0546e462df4384ca1658dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1249f8b1c01db98230c10bde49bdb68f17caccc1ba0546e462df4384ca1658dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-psjs7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:37Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:37 crc kubenswrapper[4816]: I0316 00:08:37.872524 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://723654bc91ce4d40a27b14a9c4df1ed6c8dff2517f378927d7a6a96aeaa6951b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\"
:{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:37Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:37 crc kubenswrapper[4816]: I0316 00:08:37.891495 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:37 crc kubenswrapper[4816]: I0316 00:08:37.891594 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:37 crc kubenswrapper[4816]: I0316 00:08:37.891617 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:37 crc kubenswrapper[4816]: I0316 00:08:37.891644 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:37 crc kubenswrapper[4816]: I0316 00:08:37.891663 4816 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:37Z","lastTransitionTime":"2026-03-16T00:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:37 crc kubenswrapper[4816]: I0316 00:08:37.893668 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37869baff974556c22aadadfa4b528b5d6b9ad236107b4dbc945b3cdbf743f20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-ide
ntity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f9fe727708fdaca68596b7d305629ade45b061a88a81085f885d490ab99e24b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:37Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:37 crc kubenswrapper[4816]: I0316 00:08:37.910428 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cnhkf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e686cd4-bddf-463e-b471-e49ea862691e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f6c3cacd4d7f6866e66b7234b280540ef2c443531de7a11f6894323ef24a9fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bwzt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cnhkf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:37Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:37 crc kubenswrapper[4816]: I0316 00:08:37.994372 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:37 crc kubenswrapper[4816]: I0316 00:08:37.994406 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:37 crc kubenswrapper[4816]: I0316 00:08:37.994415 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:37 crc kubenswrapper[4816]: I0316 00:08:37.994428 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:37 crc kubenswrapper[4816]: I0316 00:08:37.994437 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:37Z","lastTransitionTime":"2026-03-16T00:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.097281 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.097349 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.097371 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.097402 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.097427 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:38Z","lastTransitionTime":"2026-03-16T00:08:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.194808 4816 generic.go:334] "Generic (PLEG): container finished" podID="03ef49f1-0c6a-443a-8df3-2db339c562ed" containerID="dc31f057b776cf1925d265b64c7a9a9fb0993e4f3ffff530a48f28865e1c2954" exitCode=0 Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.194914 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mt7bq" event={"ID":"03ef49f1-0c6a-443a-8df3-2db339c562ed","Type":"ContainerDied","Data":"dc31f057b776cf1925d265b64c7a9a9fb0993e4f3ffff530a48f28865e1c2954"} Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.200878 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.200951 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.200982 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.201013 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.201039 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:38Z","lastTransitionTime":"2026-03-16T00:08:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.219699 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:38Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.242339 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:38Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.274484 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"49e35ef6-7a6d-438f-b598-4445d73db379\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbe90b49c5d42bb9d6d5d1b8f1d02b571403dd3f39d3118081351d72c4910914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://508e6dbe6f2f1392d16501e09ee2ec523a3c2a245cd96c6ab773e26e83b8bb6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b5a3223ff0fb5b96e5b87ee836add66498165759f4d6ed8cb6413f091967e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://603bdcbd4a6af5fda7ca09e4823e0db8d7ec3e3eb38eddf9f062a14b433cb094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b061f16a65a27b9a5ef66231376b70fec084479a991b7fdd430957df76da32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:38Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.301962 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-szscw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9789e58-12c8-4831-9401-af48a3e92209\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6f0b7e8f95bbbf3b90133349b8b13d2784582a5d95dae8dd159328b570ae962\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f1
3fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxf6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"ho
stIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-szscw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:38Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.308263 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.308319 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.308332 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.308355 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.308374 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:38Z","lastTransitionTime":"2026-03-16T00:08:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.322233 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd08ece2-7636-4966-973a-e96a34b70b53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6dd22d6548986bf126a8ce6b30c7c11d8c37bd0c03539a55cc28fb7debcfc26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trwbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7003a4592a48f87ab63e9f5637c7665040f9784a7c8f599c82ed61270ddb793c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trwbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jrdcz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:38Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.347247 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1249f8b1c01db98230c10bde49bdb68f17caccc1ba0546e462df4384ca1658dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1249f8b1c01db98230c10bde49bdb68f17caccc1ba0546e462df4384ca1658dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-psjs7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:38Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.367130 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-lhpbn"] Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.367713 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-lhpbn" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.371191 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.371267 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.371320 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.372788 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.378457 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://723654bc91ce4d40a27b14a9c4df1ed6c8dff2517f378927d7a6a96aeaa6951b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:38Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.395925 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37869baff974556c22aadadfa4b528b5d6b9ad236107b4dbc945b3cdbf743f20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://2f9fe727708fdaca68596b7d305629ade45b061a88a81085f885d490ab99e24b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:38Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.407399 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cnhkf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e686cd4-bddf-463e-b471-e49ea862691e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f6c3cacd4d7f6866e66b7234b280540ef2c443531de7a11f6894323ef24a9fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bwzt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cnhkf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:38Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.411691 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.411731 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.411748 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.411773 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.411792 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:38Z","lastTransitionTime":"2026-03-16T00:08:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.422685 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c8786d1-025d-4788-bafe-c2c2eaf8e398\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46db23088e8ba791d736f70a65bd066af80a5ec27c31332d9ac9c52530b21bfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da7b257af12b9ee605dc32a26d9565f866a9cafebb4936a0bde7f42ae9909f80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c2523ea43f2490afb81354be8a2840f54a3be1a8d3154196e0c8ac365d09723a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0316 00:07:59.373922 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0316 00:07:59.374075 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0316 00:07:59.374860 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2696044728/tls.crt::/tmp/serving-cert-2696044728/tls.key\\\\\\\"\\\\nI0316 00:07:59.560928 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0316 00:07:59.563996 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0316 00:07:59.564013 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0316 00:07:59.564035 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0316 00:07:59.564041 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0316 00:07:59.571475 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0316 00:07:59.571523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571531 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571540 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0316 00:07:59.571574 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0316 00:07:59.571588 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0316 00:07:59.571493 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0316 00:07:59.571597 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0316 00:07:59.573484 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81d0c40be9c3912864280cd98481f837ef29f69d85599b39decf5e9d7a97eff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:38Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.435776 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://516852108b3d17a15676573ff080d3d94fa48d8076ed0404a8d827e715c2555e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:38Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.450306 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:38Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.466265 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mt7bq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"03ef49f1-0c6a-443a-8df3-2db339c562ed\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8979f751ec4797986924a3efea84d02bc1c1cf303e1e992f349c9083f2fe4f08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8979f751ec4797986924a3efea84d02bc1c1cf303e1e992f349c9083f2fe4f08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa87691b0dc314209d88d1f4d5b2227abc1de527e8111831bb8243ba6192c300\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa87691b0dc314209d88d1f4d5b2227abc1de527e8111831bb8243ba6192c300\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33f82b993e3d729a5830e88136e74687c930d6ad1a15c64f003f311fda418bce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33f82b993e3d729a5830e88136e74687c930d6ad1a15c64f003f311fda418bce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b640c26797535db11051d50aefd13b7b2759cfa19d163f53764b0fe8710cefdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b640c26797535db11051d50aefd13b7b2759cfa19d163f53764b0fe8710cefdc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe5359762a36b39c9f8ae939a1b9d094c42ffab59c4b9743a83b4391c171a512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe5359762a36b39c9f8ae939a1b9d094c42ffab59c4b9743a83b4391c171a512\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc31f057b776cf1925d265b64c7a9a9fb0993e4f3ffff530a48f28865e1c2954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc31f057b776cf1925d265b64c7a9a9fb0993e4f3ffff530a48f28865e1c2954\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mt7bq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:38Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.474300 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2x8m\" (UniqueName: \"kubernetes.io/projected/6ec6a8ee-efd9-45df-bb35-706fcc90ebe9-kube-api-access-c2x8m\") pod \"node-ca-lhpbn\" (UID: \"6ec6a8ee-efd9-45df-bb35-706fcc90ebe9\") " pod="openshift-image-registry/node-ca-lhpbn" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.474407 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6ec6a8ee-efd9-45df-bb35-706fcc90ebe9-host\") pod \"node-ca-lhpbn\" (UID: \"6ec6a8ee-efd9-45df-bb35-706fcc90ebe9\") " 
pod="openshift-image-registry/node-ca-lhpbn" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.474448 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/6ec6a8ee-efd9-45df-bb35-706fcc90ebe9-serviceca\") pod \"node-ca-lhpbn\" (UID: \"6ec6a8ee-efd9-45df-bb35-706fcc90ebe9\") " pod="openshift-image-registry/node-ca-lhpbn" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.480657 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:38Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.496827 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:38Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.514782 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.514821 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.514832 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 
00:08:38.514851 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.514865 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:38Z","lastTransitionTime":"2026-03-16T00:08:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.521330 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49e35ef6-7a6d-438f-b598-4445d73db379\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbe90b49c5d42bb9d6d5d1b8f1d02b571403dd3f39d3118081351d72c4910914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://508e6dbe6f2f1392d16501e09ee2ec523a3c2a245cd96c6ab773e26e83b8bb6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b5a3223ff0fb5b96e5b87ee836add66498165759f4d6ed8cb6413f091967e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\
"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://603bdcbd4a6af5fda7ca09e4823e0db8d7ec3e3eb38eddf9f062a14b433cb094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b061f16a65a27b9a5ef66231376b70fec084479a991b7fdd430957df76da32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-
dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ec
d6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:38Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.540061 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-szscw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9789e58-12c8-4831-9401-af48a3e92209\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6f0b7e8f95bbbf3b90133349b8b13d2784582a5d95dae8dd159328b570ae962\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxf6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-szscw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:38Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.555753 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd08ece2-7636-4966-973a-e96a34b70b53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6dd22d6548986bf126a8ce6b30c7c11d8c37bd0c03539a55cc28fb7debcfc26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trwbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7003a4592a48
f87ab63e9f5637c7665040f9784a7c8f599c82ed61270ddb793c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trwbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jrdcz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:38Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.576027 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6ec6a8ee-efd9-45df-bb35-706fcc90ebe9-host\") pod \"node-ca-lhpbn\" (UID: \"6ec6a8ee-efd9-45df-bb35-706fcc90ebe9\") " pod="openshift-image-registry/node-ca-lhpbn" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.576075 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" 
(UniqueName: \"kubernetes.io/configmap/6ec6a8ee-efd9-45df-bb35-706fcc90ebe9-serviceca\") pod \"node-ca-lhpbn\" (UID: \"6ec6a8ee-efd9-45df-bb35-706fcc90ebe9\") " pod="openshift-image-registry/node-ca-lhpbn" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.576143 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c2x8m\" (UniqueName: \"kubernetes.io/projected/6ec6a8ee-efd9-45df-bb35-706fcc90ebe9-kube-api-access-c2x8m\") pod \"node-ca-lhpbn\" (UID: \"6ec6a8ee-efd9-45df-bb35-706fcc90ebe9\") " pod="openshift-image-registry/node-ca-lhpbn" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.576200 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6ec6a8ee-efd9-45df-bb35-706fcc90ebe9-host\") pod \"node-ca-lhpbn\" (UID: \"6ec6a8ee-efd9-45df-bb35-706fcc90ebe9\") " pod="openshift-image-registry/node-ca-lhpbn" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.577896 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/6ec6a8ee-efd9-45df-bb35-706fcc90ebe9-serviceca\") pod \"node-ca-lhpbn\" (UID: \"6ec6a8ee-efd9-45df-bb35-706fcc90ebe9\") " pod="openshift-image-registry/node-ca-lhpbn" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.581806 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1249f8b1c01db98230c10bde49bdb68f17caccc1ba0546e462df4384ca1658dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1249f8b1c01db98230c10bde49bdb68f17caccc1ba0546e462df4384ca1658dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-psjs7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:38Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.594074 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2x8m\" (UniqueName: \"kubernetes.io/projected/6ec6a8ee-efd9-45df-bb35-706fcc90ebe9-kube-api-access-c2x8m\") pod \"node-ca-lhpbn\" (UID: \"6ec6a8ee-efd9-45df-bb35-706fcc90ebe9\") " pod="openshift-image-registry/node-ca-lhpbn" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.597889 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://723654bc91ce4d40a27b14a9c4df1ed6c8dff2517f378927d7a6a96aeaa6951b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:38Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.613057 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37869baff974556c22aadadfa4b528b5d6b9ad236107b4dbc945b3cdbf743f20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://2f9fe727708fdaca68596b7d305629ade45b061a88a81085f885d490ab99e24b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:38Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.617290 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.617343 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.617354 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.617376 4816 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.617389 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:38Z","lastTransitionTime":"2026-03-16T00:08:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.626155 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cnhkf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e686cd4-bddf-463e-b471-e49ea862691e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f6c3cacd4d7f6866e66b7234b280540ef2c443531de7a11f6894323ef24a9fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",
\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bwzt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cnhkf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:38Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.639309 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c8786d1-025d-4788-bafe-c2c2eaf8e398\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"message\\\":\\\"containers with 
unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46db23088e8ba791d736f70a65bd066af80a5ec27c31332d9ac9c52530b21bfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da7b257af12b9ee605dc32a26d9565f866a9cafebb4936a0bde7f42ae9909f80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2523ea43f2490afb81354be8a2840f54a3be1a8d3154196e0c8ac365d09723a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0316 00:07:59.373922 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0316 00:07:59.374075 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0316 00:07:59.374860 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2696044728/tls.crt::/tmp/serving-cert-2696044728/tls.key\\\\\\\"\\\\nI0316 00:07:59.560928 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0316 00:07:59.563996 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0316 00:07:59.564013 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0316 00:07:59.564035 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0316 00:07:59.564041 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0316 00:07:59.571475 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0316 00:07:59.571523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571531 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571540 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0316 00:07:59.571574 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0316 00:07:59.571588 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0316 00:07:59.571493 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0316 00:07:59.571597 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0316 00:07:59.573484 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81d0c40be9c3912864280cd98481f837ef29f69d85599b39decf5e9d7a97eff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e3
9d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:38Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.651815 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://516852108b3d17a15676573ff080d3d94fa48d8076ed0404a8d827e715c2555e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-16T00:08:38Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.662945 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:38Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.667535 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.667710 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:08:38 crc kubenswrapper[4816]: E0316 00:08:38.667786 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:08:38 crc kubenswrapper[4816]: E0316 00:08:38.667707 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.667811 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:08:38 crc kubenswrapper[4816]: E0316 00:08:38.668058 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.675998 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mt7bq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"03ef49f1-0c6a-443a-8df3-2db339c562ed\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8979f751ec4797986924a3efea84d02bc1c1cf303e1e992f349c9083f2fe4f08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8979f751ec4797986924a3efea84d02bc1c1cf303e1e992f349c9083f2fe4f08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa87691b0dc314209d88d1f4d5b2227abc1de527e8111831bb8243ba6192c300\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa87691b0dc314209d88d1f4d5b2227abc1de527e8111831bb8243ba6192c300\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33f82b993e3d729a5830e88136e74687c930d6ad1a15c64f003f311fda418bce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33f82b993e3d729a5830e88136e74687c930d6ad1a15c64f003f311fda418bce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b640c26797535db11051d50aefd13b7b2759cfa19d163f53764b0fe8710cefdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b640c26797535db11051d50aefd13b7b2759cfa19d163f53764b0fe8710cefdc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe5359762a36b39c9f8ae939a1b9d094c42ffab59c4b9743a83b4391c171a512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe5359762a36b39c9f8ae939a1b9d094c42ffab59c4b9743a83b4391c171a512\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc31f057b776cf1925d265b64c7a9a9fb0993e4f3ffff530a48f28865e1c2954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc31f057b776cf1925d265b64c7a9a9fb0993e4f3ffff530a48f28865e1c2954\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mt7bq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:38Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.686049 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lhpbn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ec6a8ee-efd9-45df-bb35-706fcc90ebe9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:38Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c2x8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lhpbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:38Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.699237 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-lhpbn" Mar 16 00:08:38 crc kubenswrapper[4816]: W0316 00:08:38.711656 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6ec6a8ee_efd9_45df_bb35_706fcc90ebe9.slice/crio-d491dbc9accdecc3ae17cc01a0289ebee30340a6935a058858248641b11f3e8f WatchSource:0}: Error finding container d491dbc9accdecc3ae17cc01a0289ebee30340a6935a058858248641b11f3e8f: Status 404 returned error can't find the container with id d491dbc9accdecc3ae17cc01a0289ebee30340a6935a058858248641b11f3e8f Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.719484 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.719527 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.719540 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.719587 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.719608 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:38Z","lastTransitionTime":"2026-03-16T00:08:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.822231 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.822292 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.822305 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.822328 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.822345 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:38Z","lastTransitionTime":"2026-03-16T00:08:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.924674 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.924744 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.924756 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.924798 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:38 crc kubenswrapper[4816]: I0316 00:08:38.924811 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:38Z","lastTransitionTime":"2026-03-16T00:08:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.009056 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.009122 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.009132 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.009147 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.009175 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:39Z","lastTransitionTime":"2026-03-16T00:08:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:39 crc kubenswrapper[4816]: E0316 00:08:39.023506 4816 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d8fb348c-4907-4c8f-859a-735976530e03\\\",\\\"systemUUID\\\":\\\"e97abf98-298f-4589-a70a-4cfb5cb2994a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:39Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.028060 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.028111 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.028127 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.028148 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.028162 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:39Z","lastTransitionTime":"2026-03-16T00:08:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:39 crc kubenswrapper[4816]: E0316 00:08:39.040479 4816 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d8fb348c-4907-4c8f-859a-735976530e03\\\",\\\"systemUUID\\\":\\\"e97abf98-298f-4589-a70a-4cfb5cb2994a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:39Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.046273 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.046350 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.046369 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.046398 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.046420 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:39Z","lastTransitionTime":"2026-03-16T00:08:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:39 crc kubenswrapper[4816]: E0316 00:08:39.062837 4816 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d8fb348c-4907-4c8f-859a-735976530e03\\\",\\\"systemUUID\\\":\\\"e97abf98-298f-4589-a70a-4cfb5cb2994a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:39Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.074860 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.074896 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.074906 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.074923 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.074936 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:39Z","lastTransitionTime":"2026-03-16T00:08:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:39 crc kubenswrapper[4816]: E0316 00:08:39.087995 4816 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d8fb348c-4907-4c8f-859a-735976530e03\\\",\\\"systemUUID\\\":\\\"e97abf98-298f-4589-a70a-4cfb5cb2994a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:39Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.092091 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.092130 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.092143 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.092161 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.092176 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:39Z","lastTransitionTime":"2026-03-16T00:08:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:39 crc kubenswrapper[4816]: E0316 00:08:39.110983 4816 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d8fb348c-4907-4c8f-859a-735976530e03\\\",\\\"systemUUID\\\":\\\"e97abf98-298f-4589-a70a-4cfb5cb2994a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:39Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:39 crc kubenswrapper[4816]: E0316 00:08:39.111129 4816 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.112728 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.112765 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.112776 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.112793 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.112805 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:39Z","lastTransitionTime":"2026-03-16T00:08:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.203258 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mt7bq" event={"ID":"03ef49f1-0c6a-443a-8df3-2db339c562ed","Type":"ContainerStarted","Data":"d31459f93f54cc1cd99638b1f108b8997b34f246cf8c65e5462e672f04aaa7c0"} Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.208799 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" event={"ID":"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88","Type":"ContainerStarted","Data":"a2798f27d9234a85bd45aa5d5d9d6b95668aa8f3ac72933b52bb6b2bc9920651"} Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.209396 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.209442 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.209625 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.211648 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-lhpbn" event={"ID":"6ec6a8ee-efd9-45df-bb35-706fcc90ebe9","Type":"ContainerStarted","Data":"a34b33696ef85102800de34359b2f8b4bf11c03ca6347dde766456bcea20e0e7"} Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.211719 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-lhpbn" event={"ID":"6ec6a8ee-efd9-45df-bb35-706fcc90ebe9","Type":"ContainerStarted","Data":"d491dbc9accdecc3ae17cc01a0289ebee30340a6935a058858248641b11f3e8f"} Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.215466 4816 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.215495 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.215507 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.215522 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.215532 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:39Z","lastTransitionTime":"2026-03-16T00:08:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.218344 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lhpbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ec6a8ee-efd9-45df-bb35-706fcc90ebe9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:38Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c2x8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lhpbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:39Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.235039 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c8786d1-025d-4788-bafe-c2c2eaf8e398\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46db23088e8ba791d736f70a65bd066af80a5ec27c31332d9ac9c52530b21bfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da7b257af12b9ee605dc32a26d9565f866a9cafebb4936a0bde7f42ae9909f80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2523ea43f2490afb81354be8a2840f54a3be1a8d3154196e0c8ac365d09723a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0316 00:07:59.373922 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0316 00:07:59.374075 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0316 00:07:59.374860 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2696044728/tls.crt::/tmp/serving-cert-2696044728/tls.key\\\\\\\"\\\\nI0316 00:07:59.560928 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0316 00:07:59.563996 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0316 00:07:59.564013 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0316 00:07:59.564035 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0316 00:07:59.564041 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0316 00:07:59.571475 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0316 00:07:59.571523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571531 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571540 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0316 00:07:59.571574 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0316 00:07:59.571588 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0316 00:07:59.571493 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0316 00:07:59.571597 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0316 00:07:59.573484 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81d0c40be9c3912864280cd98481f837ef29f69d85599b39decf5e9d7a97eff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:39Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.241322 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.244232 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.250018 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://516852108b3d17a15676573ff080d3d94fa48d8076ed0404a8d827e715c2555e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-16T00:08:39Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.264737 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:39Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.280601 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mt7bq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"03ef49f1-0c6a-443a-8df3-2db339c562ed\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d31459f93f54cc1cd99638b1f108b8997b34f246cf8c65e5462e672f04aaa7c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8979f751ec4797986924a3efea84d02bc1c1cf303e1e992f349c9083f2fe4f08\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8979f751ec4797986924a3efea84d02bc1c1cf303e1e992f349c9083f2fe4f08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa87691b0dc314209d88d1f4d5b2227abc1de527e8111831bb8243ba6192c300\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa87691b0dc314209d88d1f4d5b2227abc1de527e8111831bb8243ba6192c300\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33f82b993e3d729a5830e88136e74687c930d6ad1a15c64f003f311fda418bce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33f82b993e3d729a5830e88136e74687c930d6ad1a15c64f003f311fda418bce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b640c
26797535db11051d50aefd13b7b2759cfa19d163f53764b0fe8710cefdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b640c26797535db11051d50aefd13b7b2759cfa19d163f53764b0fe8710cefdc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe5359762a36b39c9f8ae939a1b9d094c42ffab59c4b9743a83b4391c171a512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe5359762a36b39c9f8ae939a1b9d094c42ffab59c4b9743a83b4391c171a512\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:36Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc31f057b776cf1925d265b64c7a9a9fb0993e4f3ffff530a48f28865e1c2954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc31f057b776cf1925d265b64c7a9a9fb0993e4f3ffff530a48f28865e1c2954\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mt7bq\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:39Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.294179 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:39Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.309622 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:39Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.317834 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.317884 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.317897 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 
00:08:39.317915 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.317927 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:39Z","lastTransitionTime":"2026-03-16T00:08:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.333305 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49e35ef6-7a6d-438f-b598-4445d73db379\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbe90b49c5d42bb9d6d5d1b8f1d02b571403dd3f39d3118081351d72c4910914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://508e6dbe6f2f1392d16501e09ee2ec523a3c2a245cd96c6ab773e26e83b8bb6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b5a3223ff0fb5b96e5b87ee836add66498165759f4d6ed8cb6413f091967e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\
"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://603bdcbd4a6af5fda7ca09e4823e0db8d7ec3e3eb38eddf9f062a14b433cb094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b061f16a65a27b9a5ef66231376b70fec084479a991b7fdd430957df76da32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-
dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ec
d6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:39Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.351798 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-szscw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9789e58-12c8-4831-9401-af48a3e92209\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6f0b7e8f95bbbf3b90133349b8b13d2784582a5d95dae8dd159328b570ae962\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxf6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-szscw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:39Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.366225 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd08ece2-7636-4966-973a-e96a34b70b53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6dd22d6548986bf126a8ce6b30c7c11d8c37bd0c03539a55cc28fb7debcfc26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trwbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7003a4592a48
f87ab63e9f5637c7665040f9784a7c8f599c82ed61270ddb793c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trwbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jrdcz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:39Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.387961 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1249f8b1c01db98230c10bde49bdb68f17caccc1ba0546e462df4384ca1658dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1249f8b1c01db98230c10bde49bdb68f17caccc1ba0546e462df4384ca1658dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-psjs7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:39Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.399992 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://723654bc91ce4d40a27b14a9c4df1ed6c8dff2517f378927d7a6a96aeaa6951b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\"
:{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:39Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.413160 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37869baff974556c22aadadfa4b528b5d6b9ad236107b4dbc945b3cdbf743f20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f9fe727708fdaca68596b7d305629ade45b061a88a81085f885d490ab99e24b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:39Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.420035 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.420072 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.420083 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.420102 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.420113 4816 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:39Z","lastTransitionTime":"2026-03-16T00:08:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.423918 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cnhkf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e686cd4-bddf-463e-b471-e49ea862691e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f6c3cacd4d7f6866e66b7234b280540ef2c443531de7a11f6894323ef24a9fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bwzt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cnhkf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:39Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.437867 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready 
status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:39Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.450864 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:39Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.475836 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"49e35ef6-7a6d-438f-b598-4445d73db379\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbe90b49c5d42bb9d6d5d1b8f1d02b571403dd3f39d3118081351d72c4910914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://508e6dbe6f2f1392d16501e09ee2ec523a3c2a245cd96c6ab773e26e83b8bb6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b5a3223ff0fb5b96e5b87ee836add66498165759f4d6ed8cb6413f091967e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://603bdcbd4a6af5fda7ca09e4823e0db8d7ec3e3eb38eddf9f062a14b433cb094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b061f16a65a27b9a5ef66231376b70fec084479a991b7fdd430957df76da32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:39Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.495511 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-szscw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9789e58-12c8-4831-9401-af48a3e92209\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6f0b7e8f95bbbf3b90133349b8b13d2784582a5d95dae8dd159328b570ae962\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f1
3fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxf6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"ho
stIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-szscw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:39Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.510640 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd08ece2-7636-4966-973a-e96a34b70b53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6dd22d6548986bf126a8ce6b30c7c11d8c37bd0c03539a55cc28fb7debcfc26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4e
f318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trwbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7003a4592a48f87ab63e9f5637c7665040f9784a7c8f599c82ed61270ddb793c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trwbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jrdcz\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:39Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.522757 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.522800 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.522809 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.522823 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.522831 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:39Z","lastTransitionTime":"2026-03-16T00:08:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.542643 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f47b6c54d5d72d827208f4d8228be260d9d19b4a720b3174349288dee2916641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e0a510814d60106574e3cd3e612c031ac900caa0f24a12a62d7cd1f6bfda1ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://826495d8ab8f414169bf36159278da0a040fec699a74b4aebdc8f87ccdc48bd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa5f052d32ad8f52f33ee5baea5af98a64f05b831b86ebced6c9422fb4845fcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86c28f1ae2f2fb61b7755ac14bc7f5346036735c75daccaa0e31b4e606cfb4c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fac0a986edf984b0e34eda0f969205d5bc92f23d0edacb3c6dd8721eaed2fb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2798f27d9234a85bd45aa5d5d9d6b95668aa8f3ac72933b52bb6b2bc9920651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d6bf581280690c99ddbbecd4cf4e215a12230d377978aa48acd3c05b85a3c56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1249f8b1c01db98230c10bde49bdb68f17caccc1ba0546e462df4384ca1658dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1249f8b1c01db98230c10bde49bdb68f17caccc1ba0546e462df4384ca1658dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-psjs7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:39Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.563722 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://723654bc91ce4d40a27b14a9c4df1ed6c8dff2517f378927d7a6a96aeaa6951b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-1
6T00:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:39Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.584426 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37869baff974556c22aadadfa4b528b5d6b9ad236107b4dbc945b3cdbf743f20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f9fe727708fdaca68596b7d305629ade45b061a88a81085f885d490ab99e24b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:39Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.601073 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cnhkf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e686cd4-bddf-463e-b471-e49ea862691e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f6c3cacd4d7f6866e66b7234b280540ef2c443531de7a11f6894323ef24a9fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bwzt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cnhkf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:39Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.622596 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c8786d1-025d-4788-bafe-c2c2eaf8e398\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46db23088e8ba791d736f70a65bd066af80a5ec27c31332d9ac9c52530b21bfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da7b257af12b9ee605dc32a26d9565f866a9cafebb4936a0bde7f42ae9909f80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c2523ea43f2490afb81354be8a2840f54a3be1a8d3154196e0c8ac365d09723a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0316 00:07:59.373922 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0316 00:07:59.374075 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0316 00:07:59.374860 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2696044728/tls.crt::/tmp/serving-cert-2696044728/tls.key\\\\\\\"\\\\nI0316 00:07:59.560928 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0316 00:07:59.563996 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0316 00:07:59.564013 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0316 00:07:59.564035 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0316 00:07:59.564041 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0316 00:07:59.571475 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0316 00:07:59.571523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571531 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571540 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0316 00:07:59.571574 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0316 00:07:59.571588 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0316 00:07:59.571493 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0316 00:07:59.571597 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0316 00:07:59.573484 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81d0c40be9c3912864280cd98481f837ef29f69d85599b39decf5e9d7a97eff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:39Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.625479 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.625536 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.625575 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.625601 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.625623 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:39Z","lastTransitionTime":"2026-03-16T00:08:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.642201 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://516852108b3d17a15676573ff080d3d94fa48d8076ed0404a8d827e715c2555e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:39Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.655487 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:39Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.677118 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mt7bq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"03ef49f1-0c6a-443a-8df3-2db339c562ed\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d31459f93f54cc1cd99638b1f108b8997b34f246cf8c65e5462e672f04aaa7c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8979f751ec4797986924a3efea84d02bc1c1cf303e1e992f349c9083f2fe4f08\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8979f751ec4797986924a3efea84d02bc1c1cf303e1e992f349c9083f2fe4f08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa87691b0dc314209d88d1f4d5b2227abc1de527e8111831bb8243ba6192c300\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa87691b0dc314209d88d1f4d5b2227abc1de527e8111831bb8243ba6192c300\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33f82b993e3d729a5830e88136e74687c930d6ad1a15c64f003f311fda418bce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33f82b993e3d729a5830e88136e74687c930d6ad1a15c64f003f311fda418bce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b640c
26797535db11051d50aefd13b7b2759cfa19d163f53764b0fe8710cefdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b640c26797535db11051d50aefd13b7b2759cfa19d163f53764b0fe8710cefdc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe5359762a36b39c9f8ae939a1b9d094c42ffab59c4b9743a83b4391c171a512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe5359762a36b39c9f8ae939a1b9d094c42ffab59c4b9743a83b4391c171a512\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:36Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc31f057b776cf1925d265b64c7a9a9fb0993e4f3ffff530a48f28865e1c2954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc31f057b776cf1925d265b64c7a9a9fb0993e4f3ffff530a48f28865e1c2954\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mt7bq\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:39Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.688316 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lhpbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ec6a8ee-efd9-45df-bb35-706fcc90ebe9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a34b33696ef85102800de34359b2f8b4bf11c03ca6347dde766456bcea20e0e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-03-16T00:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c2x8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lhpbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:39Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.728423 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.728477 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.728491 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.728511 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.728529 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:39Z","lastTransitionTime":"2026-03-16T00:08:39Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.831668 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.831697 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.831706 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.831718 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.831726 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:39Z","lastTransitionTime":"2026-03-16T00:08:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.934446 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.934494 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.934511 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.934535 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:39 crc kubenswrapper[4816]: I0316 00:08:39.934576 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:39Z","lastTransitionTime":"2026-03-16T00:08:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:40 crc kubenswrapper[4816]: I0316 00:08:40.037152 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:40 crc kubenswrapper[4816]: I0316 00:08:40.037217 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:40 crc kubenswrapper[4816]: I0316 00:08:40.037242 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:40 crc kubenswrapper[4816]: I0316 00:08:40.037274 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:40 crc kubenswrapper[4816]: I0316 00:08:40.037297 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:40Z","lastTransitionTime":"2026-03-16T00:08:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:40 crc kubenswrapper[4816]: I0316 00:08:40.141700 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:40 crc kubenswrapper[4816]: I0316 00:08:40.141754 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:40 crc kubenswrapper[4816]: I0316 00:08:40.141788 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:40 crc kubenswrapper[4816]: I0316 00:08:40.141811 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:40 crc kubenswrapper[4816]: I0316 00:08:40.141826 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:40Z","lastTransitionTime":"2026-03-16T00:08:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:40 crc kubenswrapper[4816]: I0316 00:08:40.244489 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:40 crc kubenswrapper[4816]: I0316 00:08:40.244574 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:40 crc kubenswrapper[4816]: I0316 00:08:40.244591 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:40 crc kubenswrapper[4816]: I0316 00:08:40.244611 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:40 crc kubenswrapper[4816]: I0316 00:08:40.244628 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:40Z","lastTransitionTime":"2026-03-16T00:08:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:40 crc kubenswrapper[4816]: I0316 00:08:40.347651 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:40 crc kubenswrapper[4816]: I0316 00:08:40.347723 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:40 crc kubenswrapper[4816]: I0316 00:08:40.347740 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:40 crc kubenswrapper[4816]: I0316 00:08:40.347763 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:40 crc kubenswrapper[4816]: I0316 00:08:40.347782 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:40Z","lastTransitionTime":"2026-03-16T00:08:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:40 crc kubenswrapper[4816]: I0316 00:08:40.417867 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:08:40 crc kubenswrapper[4816]: E0316 00:08:40.418122 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-16 00:09:12.418076803 +0000 UTC m=+145.514376796 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:08:40 crc kubenswrapper[4816]: I0316 00:08:40.450786 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:40 crc kubenswrapper[4816]: I0316 00:08:40.450845 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:40 crc kubenswrapper[4816]: I0316 00:08:40.450862 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:40 crc kubenswrapper[4816]: I0316 00:08:40.450887 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:40 crc kubenswrapper[4816]: I0316 00:08:40.450906 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:40Z","lastTransitionTime":"2026-03-16T00:08:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:40 crc kubenswrapper[4816]: I0316 00:08:40.519737 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:08:40 crc kubenswrapper[4816]: I0316 00:08:40.519816 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:08:40 crc kubenswrapper[4816]: I0316 00:08:40.519866 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:08:40 crc kubenswrapper[4816]: I0316 00:08:40.519904 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:08:40 crc kubenswrapper[4816]: E0316 00:08:40.520073 4816 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" 
not registered Mar 16 00:08:40 crc kubenswrapper[4816]: E0316 00:08:40.520100 4816 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 16 00:08:40 crc kubenswrapper[4816]: E0316 00:08:40.520123 4816 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 16 00:08:40 crc kubenswrapper[4816]: E0316 00:08:40.520205 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-16 00:09:12.520177126 +0000 UTC m=+145.616477119 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 16 00:08:40 crc kubenswrapper[4816]: E0316 00:08:40.520201 4816 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 16 00:08:40 crc kubenswrapper[4816]: E0316 00:08:40.520263 4816 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 16 00:08:40 crc kubenswrapper[4816]: E0316 00:08:40.520318 4816 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 16 00:08:40 crc kubenswrapper[4816]: E0316 00:08:40.520343 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-16 00:09:12.52031441 +0000 UTC m=+145.616614393 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 16 00:08:40 crc kubenswrapper[4816]: E0316 00:08:40.520358 4816 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 16 00:08:40 crc kubenswrapper[4816]: E0316 00:08:40.520372 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-16 00:09:12.520358702 +0000 UTC m=+145.616658685 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 16 00:08:40 crc kubenswrapper[4816]: E0316 00:08:40.520384 4816 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 16 00:08:40 crc kubenswrapper[4816]: E0316 00:08:40.520470 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. 
No retries permitted until 2026-03-16 00:09:12.520446385 +0000 UTC m=+145.616746408 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 16 00:08:40 crc kubenswrapper[4816]: I0316 00:08:40.554594 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:40 crc kubenswrapper[4816]: I0316 00:08:40.554655 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:40 crc kubenswrapper[4816]: I0316 00:08:40.554671 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:40 crc kubenswrapper[4816]: I0316 00:08:40.554697 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:40 crc kubenswrapper[4816]: I0316 00:08:40.554714 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:40Z","lastTransitionTime":"2026-03-16T00:08:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:40 crc kubenswrapper[4816]: I0316 00:08:40.657656 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:40 crc kubenswrapper[4816]: I0316 00:08:40.657689 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:40 crc kubenswrapper[4816]: I0316 00:08:40.657698 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:40 crc kubenswrapper[4816]: I0316 00:08:40.657713 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:40 crc kubenswrapper[4816]: I0316 00:08:40.657724 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:40Z","lastTransitionTime":"2026-03-16T00:08:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:40 crc kubenswrapper[4816]: I0316 00:08:40.667722 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:08:40 crc kubenswrapper[4816]: I0316 00:08:40.667794 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:08:40 crc kubenswrapper[4816]: I0316 00:08:40.667819 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:08:40 crc kubenswrapper[4816]: E0316 00:08:40.667898 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:08:40 crc kubenswrapper[4816]: E0316 00:08:40.668006 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:08:40 crc kubenswrapper[4816]: E0316 00:08:40.668087 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:08:40 crc kubenswrapper[4816]: I0316 00:08:40.759676 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:40 crc kubenswrapper[4816]: I0316 00:08:40.759715 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:40 crc kubenswrapper[4816]: I0316 00:08:40.759725 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:40 crc kubenswrapper[4816]: I0316 00:08:40.759743 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:40 crc kubenswrapper[4816]: I0316 00:08:40.759756 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:40Z","lastTransitionTime":"2026-03-16T00:08:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:40 crc kubenswrapper[4816]: I0316 00:08:40.861928 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:40 crc kubenswrapper[4816]: I0316 00:08:40.861964 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:40 crc kubenswrapper[4816]: I0316 00:08:40.861974 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:40 crc kubenswrapper[4816]: I0316 00:08:40.861989 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:40 crc kubenswrapper[4816]: I0316 00:08:40.862002 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:40Z","lastTransitionTime":"2026-03-16T00:08:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:40 crc kubenswrapper[4816]: I0316 00:08:40.963981 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:40 crc kubenswrapper[4816]: I0316 00:08:40.964264 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:40 crc kubenswrapper[4816]: I0316 00:08:40.964273 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:40 crc kubenswrapper[4816]: I0316 00:08:40.964287 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:40 crc kubenswrapper[4816]: I0316 00:08:40.964295 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:40Z","lastTransitionTime":"2026-03-16T00:08:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:41 crc kubenswrapper[4816]: I0316 00:08:41.068542 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:41 crc kubenswrapper[4816]: I0316 00:08:41.068645 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:41 crc kubenswrapper[4816]: I0316 00:08:41.068721 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:41 crc kubenswrapper[4816]: I0316 00:08:41.068754 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:41 crc kubenswrapper[4816]: I0316 00:08:41.068800 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:41Z","lastTransitionTime":"2026-03-16T00:08:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:41 crc kubenswrapper[4816]: I0316 00:08:41.171637 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:41 crc kubenswrapper[4816]: I0316 00:08:41.171679 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:41 crc kubenswrapper[4816]: I0316 00:08:41.171691 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:41 crc kubenswrapper[4816]: I0316 00:08:41.171707 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:41 crc kubenswrapper[4816]: I0316 00:08:41.171718 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:41Z","lastTransitionTime":"2026-03-16T00:08:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:41 crc kubenswrapper[4816]: I0316 00:08:41.275098 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:41 crc kubenswrapper[4816]: I0316 00:08:41.275147 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:41 crc kubenswrapper[4816]: I0316 00:08:41.275156 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:41 crc kubenswrapper[4816]: I0316 00:08:41.275172 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:41 crc kubenswrapper[4816]: I0316 00:08:41.275182 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:41Z","lastTransitionTime":"2026-03-16T00:08:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:41 crc kubenswrapper[4816]: I0316 00:08:41.378378 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:41 crc kubenswrapper[4816]: I0316 00:08:41.378415 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:41 crc kubenswrapper[4816]: I0316 00:08:41.378429 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:41 crc kubenswrapper[4816]: I0316 00:08:41.378446 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:41 crc kubenswrapper[4816]: I0316 00:08:41.378456 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:41Z","lastTransitionTime":"2026-03-16T00:08:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:41 crc kubenswrapper[4816]: I0316 00:08:41.481923 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:41 crc kubenswrapper[4816]: I0316 00:08:41.481987 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:41 crc kubenswrapper[4816]: I0316 00:08:41.482016 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:41 crc kubenswrapper[4816]: I0316 00:08:41.482048 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:41 crc kubenswrapper[4816]: I0316 00:08:41.482070 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:41Z","lastTransitionTime":"2026-03-16T00:08:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:41 crc kubenswrapper[4816]: I0316 00:08:41.585878 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:41 crc kubenswrapper[4816]: I0316 00:08:41.585945 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:41 crc kubenswrapper[4816]: I0316 00:08:41.585962 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:41 crc kubenswrapper[4816]: I0316 00:08:41.585988 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:41 crc kubenswrapper[4816]: I0316 00:08:41.586007 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:41Z","lastTransitionTime":"2026-03-16T00:08:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:41 crc kubenswrapper[4816]: I0316 00:08:41.688945 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:41 crc kubenswrapper[4816]: I0316 00:08:41.689000 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:41 crc kubenswrapper[4816]: I0316 00:08:41.689017 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:41 crc kubenswrapper[4816]: I0316 00:08:41.689045 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:41 crc kubenswrapper[4816]: I0316 00:08:41.689063 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:41Z","lastTransitionTime":"2026-03-16T00:08:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:41 crc kubenswrapper[4816]: I0316 00:08:41.792583 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:41 crc kubenswrapper[4816]: I0316 00:08:41.792632 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:41 crc kubenswrapper[4816]: I0316 00:08:41.792648 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:41 crc kubenswrapper[4816]: I0316 00:08:41.792670 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:41 crc kubenswrapper[4816]: I0316 00:08:41.792688 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:41Z","lastTransitionTime":"2026-03-16T00:08:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:41 crc kubenswrapper[4816]: I0316 00:08:41.895898 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:41 crc kubenswrapper[4816]: I0316 00:08:41.895954 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:41 crc kubenswrapper[4816]: I0316 00:08:41.895972 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:41 crc kubenswrapper[4816]: I0316 00:08:41.895997 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:41 crc kubenswrapper[4816]: I0316 00:08:41.896016 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:41Z","lastTransitionTime":"2026-03-16T00:08:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:41 crc kubenswrapper[4816]: I0316 00:08:41.999910 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:41 crc kubenswrapper[4816]: I0316 00:08:41.999982 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:42 crc kubenswrapper[4816]: I0316 00:08:42.000001 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:42 crc kubenswrapper[4816]: I0316 00:08:42.000026 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:42 crc kubenswrapper[4816]: I0316 00:08:42.000045 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:42Z","lastTransitionTime":"2026-03-16T00:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:42 crc kubenswrapper[4816]: I0316 00:08:42.103133 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:42 crc kubenswrapper[4816]: I0316 00:08:42.103195 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:42 crc kubenswrapper[4816]: I0316 00:08:42.103212 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:42 crc kubenswrapper[4816]: I0316 00:08:42.103238 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:42 crc kubenswrapper[4816]: I0316 00:08:42.103258 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:42Z","lastTransitionTime":"2026-03-16T00:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:42 crc kubenswrapper[4816]: I0316 00:08:42.206205 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:42 crc kubenswrapper[4816]: I0316 00:08:42.206258 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:42 crc kubenswrapper[4816]: I0316 00:08:42.206277 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:42 crc kubenswrapper[4816]: I0316 00:08:42.206301 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:42 crc kubenswrapper[4816]: I0316 00:08:42.206318 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:42Z","lastTransitionTime":"2026-03-16T00:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:42 crc kubenswrapper[4816]: I0316 00:08:42.225300 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-psjs7_2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88/ovnkube-controller/0.log" Mar 16 00:08:42 crc kubenswrapper[4816]: I0316 00:08:42.230351 4816 generic.go:334] "Generic (PLEG): container finished" podID="2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" containerID="a2798f27d9234a85bd45aa5d5d9d6b95668aa8f3ac72933b52bb6b2bc9920651" exitCode=1 Mar 16 00:08:42 crc kubenswrapper[4816]: I0316 00:08:42.230417 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" event={"ID":"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88","Type":"ContainerDied","Data":"a2798f27d9234a85bd45aa5d5d9d6b95668aa8f3ac72933b52bb6b2bc9920651"} Mar 16 00:08:42 crc kubenswrapper[4816]: I0316 00:08:42.231645 4816 scope.go:117] "RemoveContainer" containerID="a2798f27d9234a85bd45aa5d5d9d6b95668aa8f3ac72933b52bb6b2bc9920651" Mar 16 00:08:42 crc kubenswrapper[4816]: I0316 00:08:42.256864 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-szscw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9789e58-12c8-4831-9401-af48a3e92209\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6f0b7e8f95bbbf3b90133349b8b13d2784582a5d95dae8dd159328b570ae962\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxf6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-szscw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:42Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:42 crc kubenswrapper[4816]: I0316 00:08:42.277463 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd08ece2-7636-4966-973a-e96a34b70b53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6dd22d6548986bf126a8ce6b30c7c11d8c37bd0c03539a55cc28fb7debcfc26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trwbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7003a4592a48
f87ab63e9f5637c7665040f9784a7c8f599c82ed61270ddb793c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trwbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jrdcz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:42Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:42 crc kubenswrapper[4816]: I0316 00:08:42.318462 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:42 crc kubenswrapper[4816]: I0316 00:08:42.318516 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:42 crc kubenswrapper[4816]: I0316 00:08:42.318537 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 
16 00:08:42 crc kubenswrapper[4816]: I0316 00:08:42.318590 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:42 crc kubenswrapper[4816]: I0316 00:08:42.318612 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:42Z","lastTransitionTime":"2026-03-16T00:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:42 crc kubenswrapper[4816]: I0316 00:08:42.328977 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f47b6c54d5d72d827208f4d8228be260d9d19b4a720b3174349288dee2916641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e0a510814d60106574e3cd3e612c031ac900caa0f24a12a62d7cd1f6bfda1ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://826495d8ab8f414169bf36159278da0a040fec699a74b4aebdc8f87ccdc48bd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa5f052d32ad8f52f33ee5baea5af98a64f05b831b86ebced6c9422fb4845fcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86c28f1ae2f2fb61b7755ac14bc7f5346036735c75daccaa0e31b4e606cfb4c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fac0a986edf984b0e34eda0f969205d5bc92f23d0edacb3c6dd8721eaed2fb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2798f27d9234a85bd45aa5d5d9d6b95668aa8f3ac72933b52bb6b2bc9920651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2798f27d9234a85bd45aa5d5d9d6b95668aa8f3ac72933b52bb6b2bc9920651\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-16T00:08:41Z\\\",\\\"message\\\":\\\"ng *v1.Namespace event handler 1 for removal\\\\nI0316 00:08:41.339626 6672 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0316 00:08:41.339647 6672 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0316 00:08:41.339627 6672 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0316 00:08:41.339804 6672 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0316 00:08:41.339837 6672 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0316 00:08:41.339843 6672 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0316 00:08:41.339871 6672 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0316 00:08:41.340083 6672 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0316 00:08:41.340097 6672 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0316 00:08:41.340108 6672 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0316 00:08:41.340116 6672 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0316 00:08:41.340212 6672 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0316 00:08:41.340278 6672 factory.go:656] Stopping watch factory\\\\nI0316 00:08:41.340297 6672 ovnkube.go:599] Stopped ovnkube\\\\nI0316 
0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"ku
be-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d6bf581280690c99ddbbecd4cf4e215a12230d377978aa48acd3c05b85a3c56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1249f8b1c01db98230c10bde49bdb68f17caccc1ba0546e462df4384ca1658dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1249f8b1c01db98230c10bde49bdb68f17caccc1ba0546e462df4384ca1658dd\\
\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-psjs7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:42Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:42 crc kubenswrapper[4816]: I0316 00:08:42.370529 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"49e35ef6-7a6d-438f-b598-4445d73db379\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbe90b49c5d42bb9d6d5d1b8f1d02b571403dd3f39d3118081351d72c4910914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://508e6dbe6f2f1392d16501e09ee2ec523a3c2a245cd96c6ab773e26e83b8bb6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b5a3223ff0fb5b96e5b87ee836add66498165759f4d6ed8cb6413f091967e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://603bdcbd4a6af5fda7ca09e4823e0db8d7ec3e3eb38eddf9f062a14b433cb094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b061f16a65a27b9a5ef66231376b70fec084479a991b7fdd430957df76da32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:42Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:42 crc kubenswrapper[4816]: I0316 00:08:42.386115 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://723654bc91ce4d40a27b14a9c4df1ed6c8dff2517f378927d7a6a96aeaa6951b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:42Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:42 crc kubenswrapper[4816]: I0316 00:08:42.397322 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37869baff974556c22aadadfa4b528b5d6b9ad236107b4dbc945b3cdbf743f20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f9fe727708fdaca68596b7d305629ade45b061a88a81085f885d490ab99e24b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:42Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:42 crc kubenswrapper[4816]: I0316 00:08:42.406256 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cnhkf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e686cd4-bddf-463e-b471-e49ea862691e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f6c3cacd4d7f6866e66b7234b280540ef2c443531de7a11f6894323ef24a9fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bwzt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cnhkf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:42Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:42 crc kubenswrapper[4816]: I0316 00:08:42.416967 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://516852108b3d17a15676573ff080d3d94fa48d8076ed0404a8d827e715c2555e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-al
erter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:42Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:42 crc kubenswrapper[4816]: I0316 00:08:42.420324 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:42 crc kubenswrapper[4816]: I0316 00:08:42.420358 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:42 crc kubenswrapper[4816]: I0316 00:08:42.420369 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:42 crc kubenswrapper[4816]: I0316 00:08:42.420386 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:42 crc kubenswrapper[4816]: I0316 00:08:42.420397 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:42Z","lastTransitionTime":"2026-03-16T00:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:42 crc kubenswrapper[4816]: I0316 00:08:42.434196 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:42Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:42 crc kubenswrapper[4816]: I0316 00:08:42.450586 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mt7bq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"03ef49f1-0c6a-443a-8df3-2db339c562ed\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d31459f93f54cc1cd99638b1f108b8997b34f246cf8c65e5462e672f04aaa7c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8979f751ec4797986924a3efea84d02bc1c1cf303e1e992f349c9083f2fe4f08\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8979f751ec4797986924a3efea84d02bc1c1cf303e1e992f349c9083f2fe4f08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa87691b0dc314209d88d1f4d5b2227abc1de527e8111831bb8243ba6192c300\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa87691b0dc314209d88d1f4d5b2227abc1de527e8111831bb8243ba6192c300\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33f82b993e3d729a5830e88136e74687c930d6ad1a15c64f003f311fda418bce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33f82b993e3d729a5830e88136e74687c930d6ad1a15c64f003f311fda418bce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b640c
26797535db11051d50aefd13b7b2759cfa19d163f53764b0fe8710cefdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b640c26797535db11051d50aefd13b7b2759cfa19d163f53764b0fe8710cefdc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe5359762a36b39c9f8ae939a1b9d094c42ffab59c4b9743a83b4391c171a512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe5359762a36b39c9f8ae939a1b9d094c42ffab59c4b9743a83b4391c171a512\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:36Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc31f057b776cf1925d265b64c7a9a9fb0993e4f3ffff530a48f28865e1c2954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc31f057b776cf1925d265b64c7a9a9fb0993e4f3ffff530a48f28865e1c2954\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mt7bq\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:42Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:42 crc kubenswrapper[4816]: I0316 00:08:42.462330 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lhpbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ec6a8ee-efd9-45df-bb35-706fcc90ebe9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a34b33696ef85102800de34359b2f8b4bf11c03ca6347dde766456bcea20e0e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-03-16T00:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c2x8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lhpbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:42Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:42 crc kubenswrapper[4816]: I0316 00:08:42.480873 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c8786d1-025d-4788-bafe-c2c2eaf8e398\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46db23088e8ba791d736f70a65bd066af80a5ec27c31332d9ac9c52530b21bfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da7b257af12b9ee605dc32a26d9565f866a9cafebb4936a0bde7f42ae9909f80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2523ea43f2490afb81354be8a2840f54a3be1a8d3154196e0c8ac365d09723a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0316 00:07:59.373922 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0316 00:07:59.374075 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0316 00:07:59.374860 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2696044728/tls.crt::/tmp/serving-cert-2696044728/tls.key\\\\\\\"\\\\nI0316 00:07:59.560928 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0316 00:07:59.563996 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0316 00:07:59.564013 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0316 00:07:59.564035 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0316 00:07:59.564041 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0316 00:07:59.571475 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0316 00:07:59.571523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571531 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571540 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0316 00:07:59.571574 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0316 00:07:59.571588 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0316 00:07:59.571493 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0316 00:07:59.571597 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0316 00:07:59.573484 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81d0c40be9c3912864280cd98481f837ef29f69d85599b39decf5e9d7a97eff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e3
9d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:42Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:42 crc kubenswrapper[4816]: I0316 00:08:42.494423 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:42Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:42 crc kubenswrapper[4816]: I0316 00:08:42.508308 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:42Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:42 crc kubenswrapper[4816]: I0316 00:08:42.523155 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:42 crc kubenswrapper[4816]: I0316 00:08:42.523223 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:42 crc kubenswrapper[4816]: I0316 00:08:42.523243 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:42 crc kubenswrapper[4816]: I0316 00:08:42.523314 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:42 crc kubenswrapper[4816]: I0316 00:08:42.523326 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:42Z","lastTransitionTime":"2026-03-16T00:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:42 crc kubenswrapper[4816]: I0316 00:08:42.625361 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:42 crc kubenswrapper[4816]: I0316 00:08:42.625407 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:42 crc kubenswrapper[4816]: I0316 00:08:42.625424 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:42 crc kubenswrapper[4816]: I0316 00:08:42.625445 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:42 crc kubenswrapper[4816]: I0316 00:08:42.625457 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:42Z","lastTransitionTime":"2026-03-16T00:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:42 crc kubenswrapper[4816]: I0316 00:08:42.667485 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:08:42 crc kubenswrapper[4816]: I0316 00:08:42.667506 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:08:42 crc kubenswrapper[4816]: E0316 00:08:42.667638 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:08:42 crc kubenswrapper[4816]: I0316 00:08:42.667605 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:08:42 crc kubenswrapper[4816]: E0316 00:08:42.667687 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:08:42 crc kubenswrapper[4816]: E0316 00:08:42.667894 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:08:42 crc kubenswrapper[4816]: I0316 00:08:42.727967 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:42 crc kubenswrapper[4816]: I0316 00:08:42.728014 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:42 crc kubenswrapper[4816]: I0316 00:08:42.728026 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:42 crc kubenswrapper[4816]: I0316 00:08:42.728043 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:42 crc kubenswrapper[4816]: I0316 00:08:42.728054 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:42Z","lastTransitionTime":"2026-03-16T00:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:42 crc kubenswrapper[4816]: I0316 00:08:42.830266 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:42 crc kubenswrapper[4816]: I0316 00:08:42.830318 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:42 crc kubenswrapper[4816]: I0316 00:08:42.830334 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:42 crc kubenswrapper[4816]: I0316 00:08:42.830352 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:42 crc kubenswrapper[4816]: I0316 00:08:42.830361 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:42Z","lastTransitionTime":"2026-03-16T00:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:42 crc kubenswrapper[4816]: I0316 00:08:42.932644 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:42 crc kubenswrapper[4816]: I0316 00:08:42.932676 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:42 crc kubenswrapper[4816]: I0316 00:08:42.932685 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:42 crc kubenswrapper[4816]: I0316 00:08:42.932700 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:42 crc kubenswrapper[4816]: I0316 00:08:42.932727 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:42Z","lastTransitionTime":"2026-03-16T00:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:43 crc kubenswrapper[4816]: I0316 00:08:43.068162 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:43 crc kubenswrapper[4816]: I0316 00:08:43.068211 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:43 crc kubenswrapper[4816]: I0316 00:08:43.068227 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:43 crc kubenswrapper[4816]: I0316 00:08:43.068249 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:43 crc kubenswrapper[4816]: I0316 00:08:43.068445 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:43Z","lastTransitionTime":"2026-03-16T00:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:43 crc kubenswrapper[4816]: I0316 00:08:43.170687 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:43 crc kubenswrapper[4816]: I0316 00:08:43.170739 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:43 crc kubenswrapper[4816]: I0316 00:08:43.170753 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:43 crc kubenswrapper[4816]: I0316 00:08:43.170774 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:43 crc kubenswrapper[4816]: I0316 00:08:43.170787 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:43Z","lastTransitionTime":"2026-03-16T00:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:43 crc kubenswrapper[4816]: I0316 00:08:43.237134 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-psjs7_2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88/ovnkube-controller/0.log" Mar 16 00:08:43 crc kubenswrapper[4816]: I0316 00:08:43.241306 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" event={"ID":"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88","Type":"ContainerStarted","Data":"f8dc5c9e05d9b292469e8d707f3727adbf33725ea339bcfefead2bbdbb094400"} Mar 16 00:08:43 crc kubenswrapper[4816]: I0316 00:08:43.241917 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" Mar 16 00:08:43 crc kubenswrapper[4816]: I0316 00:08:43.263292 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:43Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:43 crc kubenswrapper[4816]: I0316 00:08:43.272838 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:43 crc kubenswrapper[4816]: I0316 00:08:43.272866 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:43 crc kubenswrapper[4816]: I0316 00:08:43.272875 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:43 crc kubenswrapper[4816]: I0316 
00:08:43.272889 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:43 crc kubenswrapper[4816]: I0316 00:08:43.272898 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:43Z","lastTransitionTime":"2026-03-16T00:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:43 crc kubenswrapper[4816]: I0316 00:08:43.283065 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:43Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:43 crc kubenswrapper[4816]: I0316 00:08:43.308994 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f47b6c54d5d72d827208f4d8228be260d9d19b4a720b3174349288dee2916641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e0a510814d60106574e3cd3e612c031ac900caa0f24a12a62d7cd1f6bfda1ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://826495d8ab8f414169bf36159278da0a040fec699a74b4aebdc8f87ccdc48bd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa5f052d32ad8f52f33ee5baea5af98a64f05b831b86ebced6c9422fb4845fcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86c28f1ae2f2fb61b7755ac14bc7f5346036735c75daccaa0e31b4e606cfb4c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fac0a986edf984b0e34eda0f969205d5bc92f23d0edacb3c6dd8721eaed2fb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8dc5c9e05d9b292469e8d707f3727adbf33725ea339bcfefead2bbdbb094400\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2798f27d9234a85bd45aa5d5d9d6b95668aa8f3ac72933b52bb6b2bc9920651\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-16T00:08:41Z\\\",\\\"message\\\":\\\"ng *v1.Namespace event handler 1 for removal\\\\nI0316 00:08:41.339626 6672 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0316 00:08:41.339647 6672 handler.go:208] Removed *v1.NetworkPolicy event handler 
4\\\\nI0316 00:08:41.339627 6672 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0316 00:08:41.339804 6672 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0316 00:08:41.339837 6672 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0316 00:08:41.339843 6672 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0316 00:08:41.339871 6672 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0316 00:08:41.340083 6672 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0316 00:08:41.340097 6672 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0316 00:08:41.340108 6672 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0316 00:08:41.340116 6672 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0316 00:08:41.340212 6672 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0316 00:08:41.340278 6672 factory.go:656] Stopping watch factory\\\\nI0316 00:08:41.340297 6672 ovnkube.go:599] Stopped ovnkube\\\\nI0316 
0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\"
:\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d6bf581280690c99ddbbecd4cf4e215a12230d377978aa48acd3c05b85a3c56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1249f8b1c01db98230c10bde49bdb68f17caccc1ba0546e462df4384ca1658dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\
":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1249f8b1c01db98230c10bde49bdb68f17caccc1ba0546e462df4384ca1658dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-psjs7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:43Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:43 crc kubenswrapper[4816]: I0316 00:08:43.327834 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"49e35ef6-7a6d-438f-b598-4445d73db379\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbe90b49c5d42bb9d6d5d1b8f1d02b571403dd3f39d3118081351d72c4910914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://508e6dbe6f2f1392d16501e09ee2ec523a3c2a245cd96c6ab773e26e83b8bb6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b5a3223ff0fb5b96e5b87ee836add66498165759f4d6ed8cb6413f091967e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://603bdcbd4a6af5fda7ca09e4823e0db8d7ec3e3eb38eddf9f062a14b433cb094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b061f16a65a27b9a5ef66231376b70fec084479a991b7fdd430957df76da32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:43Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:43 crc kubenswrapper[4816]: I0316 00:08:43.357186 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-szscw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9789e58-12c8-4831-9401-af48a3e92209\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6f0b7e8f95bbbf3b90133349b8b13d2784582a5d95dae8dd159328b570ae962\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f1
3fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxf6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"ho
stIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-szscw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:43Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:43 crc kubenswrapper[4816]: I0316 00:08:43.370270 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd08ece2-7636-4966-973a-e96a34b70b53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6dd22d6548986bf126a8ce6b30c7c11d8c37bd0c03539a55cc28fb7debcfc26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4e
f318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trwbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7003a4592a48f87ab63e9f5637c7665040f9784a7c8f599c82ed61270ddb793c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trwbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jrdcz\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:43Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:43 crc kubenswrapper[4816]: I0316 00:08:43.375336 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:43 crc kubenswrapper[4816]: I0316 00:08:43.375378 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:43 crc kubenswrapper[4816]: I0316 00:08:43.375390 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:43 crc kubenswrapper[4816]: I0316 00:08:43.375405 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:43 crc kubenswrapper[4816]: I0316 00:08:43.375415 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:43Z","lastTransitionTime":"2026-03-16T00:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:43 crc kubenswrapper[4816]: I0316 00:08:43.383001 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cnhkf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e686cd4-bddf-463e-b471-e49ea862691e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f6c3cacd4d7f6866e66b7234b280540ef2c443531de7a11f6894323ef24a9fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bwzt\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cnhkf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:43Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:43 crc kubenswrapper[4816]: I0316 00:08:43.402988 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://723654bc91ce4d40a27b14a9c4df1ed6c8dff2517f378927d7a6a96aeaa6951b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:43Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:43 crc kubenswrapper[4816]: I0316 00:08:43.418315 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37869baff974556c22aadadfa4b528b5d6b9ad236107b4dbc945b3cdbf743f20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f9fe727708fdaca68596b7d305629ade45b061a88a81085f885d490ab99e24b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:43Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:43 crc kubenswrapper[4816]: I0316 00:08:43.439883 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mt7bq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"03ef49f1-0c6a-443a-8df3-2db339c562ed\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d31459f93f54cc1cd99638b1f108b8997b34f246cf8c65e5462e672f04aaa7c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8979f751ec4797986924a3efea84d02bc1c1cf303e1e992f349c9083f2fe4f08\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8979f751ec4797986924a3efea84d02bc1c1cf303e1e992f349c9083f2fe4f08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa87691b0dc314209d88d1f4d5b2227abc1de527e8111831bb8243ba6192c300\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa87691b0dc314209d88d1f4d5b2227abc1de527e8111831bb8243ba6192c300\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33f82b993e3d729a5830e88136e74687c930d6ad1a15c64f003f311fda418bce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33f82b993e3d729a5830e88136e74687c930d6ad1a15c64f003f311fda418bce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b640c
26797535db11051d50aefd13b7b2759cfa19d163f53764b0fe8710cefdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b640c26797535db11051d50aefd13b7b2759cfa19d163f53764b0fe8710cefdc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe5359762a36b39c9f8ae939a1b9d094c42ffab59c4b9743a83b4391c171a512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe5359762a36b39c9f8ae939a1b9d094c42ffab59c4b9743a83b4391c171a512\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:36Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc31f057b776cf1925d265b64c7a9a9fb0993e4f3ffff530a48f28865e1c2954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc31f057b776cf1925d265b64c7a9a9fb0993e4f3ffff530a48f28865e1c2954\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mt7bq\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:43Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:43 crc kubenswrapper[4816]: I0316 00:08:43.452635 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lhpbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ec6a8ee-efd9-45df-bb35-706fcc90ebe9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a34b33696ef85102800de34359b2f8b4bf11c03ca6347dde766456bcea20e0e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-03-16T00:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c2x8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lhpbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:43Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:43 crc kubenswrapper[4816]: I0316 00:08:43.471745 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c8786d1-025d-4788-bafe-c2c2eaf8e398\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46db23088e8ba791d736f70a65bd066af80a5ec27c31332d9ac9c52530b21bfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da7b257af12b9ee605dc32a26d9565f866a9cafebb4936a0bde7f42ae9909f80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2523ea43f2490afb81354be8a2840f54a3be1a8d3154196e0c8ac365d09723a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0316 00:07:59.373922 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0316 00:07:59.374075 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0316 00:07:59.374860 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2696044728/tls.crt::/tmp/serving-cert-2696044728/tls.key\\\\\\\"\\\\nI0316 00:07:59.560928 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0316 00:07:59.563996 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0316 00:07:59.564013 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0316 00:07:59.564035 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0316 00:07:59.564041 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0316 00:07:59.571475 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0316 00:07:59.571523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571531 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571540 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0316 00:07:59.571574 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0316 00:07:59.571588 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0316 00:07:59.571493 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0316 00:07:59.571597 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0316 00:07:59.573484 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81d0c40be9c3912864280cd98481f837ef29f69d85599b39decf5e9d7a97eff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e3
9d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:43Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:43 crc kubenswrapper[4816]: I0316 00:08:43.477652 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:43 crc kubenswrapper[4816]: I0316 00:08:43.477684 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:43 crc kubenswrapper[4816]: I0316 00:08:43.477692 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:43 crc kubenswrapper[4816]: I0316 00:08:43.477705 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:43 crc kubenswrapper[4816]: I0316 00:08:43.477714 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:43Z","lastTransitionTime":"2026-03-16T00:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:43 crc kubenswrapper[4816]: I0316 00:08:43.484294 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://516852108b3d17a15676573ff080d3d94fa48d8076ed0404a8d827e715c2555e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:43Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:43 crc kubenswrapper[4816]: I0316 00:08:43.497945 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:43Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:43 crc kubenswrapper[4816]: I0316 00:08:43.579606 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:43 crc kubenswrapper[4816]: I0316 00:08:43.579659 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:43 crc kubenswrapper[4816]: I0316 00:08:43.579680 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:43 crc kubenswrapper[4816]: I0316 00:08:43.579704 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:43 crc kubenswrapper[4816]: I0316 00:08:43.579717 4816 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:43Z","lastTransitionTime":"2026-03-16T00:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:43 crc kubenswrapper[4816]: I0316 00:08:43.681742 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:43 crc kubenswrapper[4816]: I0316 00:08:43.681793 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:43 crc kubenswrapper[4816]: I0316 00:08:43.681830 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:43 crc kubenswrapper[4816]: I0316 00:08:43.681881 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:43 crc kubenswrapper[4816]: I0316 00:08:43.681896 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:43Z","lastTransitionTime":"2026-03-16T00:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:43 crc kubenswrapper[4816]: I0316 00:08:43.785915 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:43 crc kubenswrapper[4816]: I0316 00:08:43.786011 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:43 crc kubenswrapper[4816]: I0316 00:08:43.786062 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:43 crc kubenswrapper[4816]: I0316 00:08:43.786112 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:43 crc kubenswrapper[4816]: I0316 00:08:43.786138 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:43Z","lastTransitionTime":"2026-03-16T00:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:43 crc kubenswrapper[4816]: I0316 00:08:43.889014 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:43 crc kubenswrapper[4816]: I0316 00:08:43.889058 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:43 crc kubenswrapper[4816]: I0316 00:08:43.889073 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:43 crc kubenswrapper[4816]: I0316 00:08:43.889095 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:43 crc kubenswrapper[4816]: I0316 00:08:43.889111 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:43Z","lastTransitionTime":"2026-03-16T00:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:43 crc kubenswrapper[4816]: I0316 00:08:43.991922 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:43 crc kubenswrapper[4816]: I0316 00:08:43.991970 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:43 crc kubenswrapper[4816]: I0316 00:08:43.991981 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:43 crc kubenswrapper[4816]: I0316 00:08:43.992012 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:43 crc kubenswrapper[4816]: I0316 00:08:43.992025 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:43Z","lastTransitionTime":"2026-03-16T00:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.095657 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.095727 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.095742 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.095761 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.095800 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:44Z","lastTransitionTime":"2026-03-16T00:08:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.198441 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.198483 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.198496 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.198512 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.198523 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:44Z","lastTransitionTime":"2026-03-16T00:08:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.246668 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-psjs7_2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88/ovnkube-controller/1.log" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.247371 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-psjs7_2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88/ovnkube-controller/0.log" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.250365 4816 generic.go:334] "Generic (PLEG): container finished" podID="2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" containerID="f8dc5c9e05d9b292469e8d707f3727adbf33725ea339bcfefead2bbdbb094400" exitCode=1 Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.250421 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" event={"ID":"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88","Type":"ContainerDied","Data":"f8dc5c9e05d9b292469e8d707f3727adbf33725ea339bcfefead2bbdbb094400"} Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.250460 4816 scope.go:117] "RemoveContainer" containerID="a2798f27d9234a85bd45aa5d5d9d6b95668aa8f3ac72933b52bb6b2bc9920651" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.251169 4816 scope.go:117] "RemoveContainer" containerID="f8dc5c9e05d9b292469e8d707f3727adbf33725ea339bcfefead2bbdbb094400" Mar 16 00:08:44 crc kubenswrapper[4816]: E0316 00:08:44.251354 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-psjs7_openshift-ovn-kubernetes(2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88)\"" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" podUID="2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.271081 4816 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:44Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.287889 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:44Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.300979 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.301022 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.301032 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 
00:08:44.301045 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.301056 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:44Z","lastTransitionTime":"2026-03-16T00:08:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.302907 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd08ece2-7636-4966-973a-e96a34b70b53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6dd22d6548986bf126a8ce6b30c7c11d8c37bd0c03539a55cc28fb7debcfc26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4e
f318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trwbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7003a4592a48f87ab63e9f5637c7665040f9784a7c8f599c82ed61270ddb793c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trwbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jrdcz\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:44Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.324196 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f47b6c54d5d72d827208f4d8228be260d9d19b4a720b3174349288dee2916641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e0a510814d60106574e3cd3e612c031ac900caa0f24a12a62d7cd1f6bfda1ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://826495d8ab8f414169bf36159278da0a040fec699a74b4aebdc8f87ccdc48bd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa5f052d32ad8f52f33ee5baea5af98a64f05b831b86ebced6c9422fb4845fcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86c28f1ae2f2fb61b7755ac14bc7f5346036735c75daccaa0e31b4e606cfb4c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fac0a986edf984b0e34eda0f969205d5bc92f23d0edacb3c6dd8721eaed2fb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8dc5c9e05d9b292469e8d707f3727adbf33725ea339bcfefead2bbdbb094400\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2798f27d9234a85bd45aa5d5d9d6b95668aa8f3ac72933b52bb6b2bc9920651\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-16T00:08:41Z\\\",\\\"message\\\":\\\"ng *v1.Namespace event handler 1 for removal\\\\nI0316 00:08:41.339626 6672 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0316 00:08:41.339647 6672 handler.go:208] Removed *v1.NetworkPolicy event handler 
4\\\\nI0316 00:08:41.339627 6672 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0316 00:08:41.339804 6672 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0316 00:08:41.339837 6672 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0316 00:08:41.339843 6672 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0316 00:08:41.339871 6672 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0316 00:08:41.340083 6672 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0316 00:08:41.340097 6672 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0316 00:08:41.340108 6672 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0316 00:08:41.340116 6672 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0316 00:08:41.340212 6672 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0316 00:08:41.340278 6672 factory.go:656] Stopping watch factory\\\\nI0316 00:08:41.340297 6672 ovnkube.go:599] Stopped ovnkube\\\\nI0316 0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8dc5c9e05d9b292469e8d707f3727adbf33725ea339bcfefead2bbdbb094400\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-16T00:08:43Z\\\",\\\"message\\\":\\\"0:08:43.170302 6850 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0316 00:08:43.170323 6850 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0316 00:08:43.170342 6850 handler.go:208] Removed *v1.Node event handler 7\\\\nI0316 00:08:43.170346 6850 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0316 00:08:43.170361 6850 handler.go:208] Removed *v1.Node event handler 2\\\\nI0316 00:08:43.170363 6850 reflector.go:311] Stopping reflector *v1.Pod (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI0316 00:08:43.170364 6850 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0316 00:08:43.170402 6850 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0316 00:08:43.170450 6850 factory.go:656] Stopping watch factory\\\\nI0316 00:08:43.170454 6850 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0316 00:08:43.170464 6850 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0316 00:08:43.170464 6850 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0316 00:08:43.170503 6850 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0316 00:08:43.170666 6850 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni
-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d6bf581280690c99ddbbecd4cf4e215a12230d377978aa48acd3c05b85a3c56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"nam
e\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1249f8b1c01db98230c10bde49bdb68f17caccc1ba0546e462df4384ca1658dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1249f8b1c01db98230c10bde49bdb68f17caccc1ba0546e462df4384ca1658dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-psjs7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:44Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.349393 4816 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49e35ef6-7a6d-438f-b598-4445d73db379\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbe90b49c5d42bb9d6d5d1b8f1d02b571403dd3f39d3118081351d72c4910914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://508e6dbe6f2f1392d16501e09ee2ec523a3c2a245cd96c6ab773e26e83b8bb6a\\\",
\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b5a3223ff0fb5b96e5b87ee836add66498165759f4d6ed8cb6413f091967e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://603bdcbd4a6af5fda7ca09e4823e0db8d7ec3e3eb38eddf9f062a14b433cb094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\
"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b061f16a65a27b9a5ef66231376b70fec084479a991b7fdd430957df76da32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b
5fe87a4fa4f1b15b174\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\
\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:44Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.365926 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-szscw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9789e58-12c8-4831-9401-af48a3e92209\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6f0b7e8f95bbbf3b90133349b8b13d2784582a5d95dae8dd159328b570ae962\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxf6x\\\",\\\"readOnly\\\":
true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-szscw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:44Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.381203 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37869baff974556c22aadadfa4b528b5d6b9ad236107b4dbc945b3cdbf743f20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\
\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f9fe727708fdaca68596b7d305629ade45b061a88a81085f885d490ab99e24b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:44Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:44 crc 
kubenswrapper[4816]: I0316 00:08:44.388056 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kbgw5"] Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.388670 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kbgw5" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.390314 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.390894 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.392988 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cnhkf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e686cd4-bddf-463e-b471-e49ea862691e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f6c3cacd4d7f6866e66b7234b280540ef2c4435
31de7a11f6894323ef24a9fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bwzt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cnhkf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:44Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.402761 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.402790 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.402802 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 
00:08:44.402818 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.402829 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:44Z","lastTransitionTime":"2026-03-16T00:08:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.407865 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://723654bc91ce4d40a27b14a9c4df1ed6c8dff2517f378927d7a6a96aeaa6951b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCou
nt\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:44Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.420439 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:44Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.436411 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mt7bq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"03ef49f1-0c6a-443a-8df3-2db339c562ed\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d31459f93f54cc1cd99638b1f108b8997b34f246cf8c65e5462e672f04aaa7c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8979f751ec4797986924a3efea84d02bc1c1cf303e1e992f349c9083f2fe4f08\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8979f751ec4797986924a3efea84d02bc1c1cf303e1e992f349c9083f2fe4f08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa87691b0dc314209d88d1f4d5b2227abc1de527e8111831bb8243ba6192c300\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa87691b0dc314209d88d1f4d5b2227abc1de527e8111831bb8243ba6192c300\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33f82b993e3d729a5830e88136e74687c930d6ad1a15c64f003f311fda418bce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33f82b993e3d729a5830e88136e74687c930d6ad1a15c64f003f311fda418bce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b640c
26797535db11051d50aefd13b7b2759cfa19d163f53764b0fe8710cefdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b640c26797535db11051d50aefd13b7b2759cfa19d163f53764b0fe8710cefdc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe5359762a36b39c9f8ae939a1b9d094c42ffab59c4b9743a83b4391c171a512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe5359762a36b39c9f8ae939a1b9d094c42ffab59c4b9743a83b4391c171a512\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:36Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc31f057b776cf1925d265b64c7a9a9fb0993e4f3ffff530a48f28865e1c2954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc31f057b776cf1925d265b64c7a9a9fb0993e4f3ffff530a48f28865e1c2954\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mt7bq\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:44Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.449374 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lhpbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ec6a8ee-efd9-45df-bb35-706fcc90ebe9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a34b33696ef85102800de34359b2f8b4bf11c03ca6347dde766456bcea20e0e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-03-16T00:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c2x8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lhpbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:44Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.468724 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c8786d1-025d-4788-bafe-c2c2eaf8e398\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46db23088e8ba791d736f70a65bd066af80a5ec27c31332d9ac9c52530b21bfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da7b257af12b9ee605dc32a26d9565f866a9cafebb4936a0bde7f42ae9909f80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2523ea43f2490afb81354be8a2840f54a3be1a8d3154196e0c8ac365d09723a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0316 00:07:59.373922 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0316 00:07:59.374075 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0316 00:07:59.374860 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2696044728/tls.crt::/tmp/serving-cert-2696044728/tls.key\\\\\\\"\\\\nI0316 00:07:59.560928 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0316 00:07:59.563996 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0316 00:07:59.564013 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0316 00:07:59.564035 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0316 00:07:59.564041 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0316 00:07:59.571475 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0316 00:07:59.571523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571531 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571540 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0316 00:07:59.571574 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0316 00:07:59.571588 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0316 00:07:59.571493 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0316 00:07:59.571597 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0316 00:07:59.573484 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81d0c40be9c3912864280cd98481f837ef29f69d85599b39decf5e9d7a97eff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e3
9d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:44Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.485840 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://516852108b3d17a15676573ff080d3d94fa48d8076ed0404a8d827e715c2555e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-16T00:08:44Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.505593 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.505647 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.505661 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.505685 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.505700 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:44Z","lastTransitionTime":"2026-03-16T00:08:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.516495 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49e35ef6-7a6d-438f-b598-4445d73db379\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbe90b49c5d42bb9d6d5d1b8f1d02b571403dd3f39d3118081351d72c4910914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://508e6dbe6f2f1392d16501e09ee2ec523a3c2a245cd96c6ab773e26e83b8bb6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b5a3223ff0fb5b96e5b87ee836add66498165759f4d6ed8cb6413f091967e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://603bdcbd4a6af5fda7ca09e4823e0db8d7ec3e3eb38eddf9f062a14b433cb094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b061f16a65a27b9a5ef66231376b70fec084479a991b7fdd430957df76da32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:44Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.529039 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-szscw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9789e58-12c8-4831-9401-af48a3e92209\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6f0b7e8f95bbbf3b90133349b8b13d2784582a5d95dae8dd159328b570ae962\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxf6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-szscw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:44Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.540843 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd08ece2-7636-4966-973a-e96a34b70b53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6dd22d6548986bf126a8ce6b30c7c11d8c37bd0c03539a55cc28fb7debcfc26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trwbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7003a4592a48
f87ab63e9f5637c7665040f9784a7c8f599c82ed61270ddb793c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trwbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jrdcz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:44Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.559060 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f47b6c54d5d72d827208f4d8228be260d9d19b4a720b3174349288dee2916641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e0a510814d60106574e3cd3e612c031ac900caa0f24a12a62d7cd1f6bfda1ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://826495d8ab8f414169bf36159278da0a040fec699a74b4aebdc8f87ccdc48bd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa5f052d32ad8f52f33ee5baea5af98a64f05b831b86ebced6c9422fb4845fcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86c28f1ae2f2fb61b7755ac14bc7f5346036735c75daccaa0e31b4e606cfb4c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fac0a986edf984b0e34eda0f969205d5bc92f23d0edacb3c6dd8721eaed2fb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8dc5c9e05d9b292469e8d707f3727adbf33725ea339bcfefead2bbdbb094400\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2798f27d9234a85bd45aa5d5d9d6b95668aa8f3ac72933b52bb6b2bc9920651\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-16T00:08:41Z\\\",\\\"message\\\":\\\"ng *v1.Namespace event handler 1 for removal\\\\nI0316 00:08:41.339626 6672 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0316 00:08:41.339647 6672 handler.go:208] Removed *v1.NetworkPolicy event handler 
4\\\\nI0316 00:08:41.339627 6672 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0316 00:08:41.339804 6672 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0316 00:08:41.339837 6672 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0316 00:08:41.339843 6672 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0316 00:08:41.339871 6672 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0316 00:08:41.340083 6672 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0316 00:08:41.340097 6672 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0316 00:08:41.340108 6672 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0316 00:08:41.340116 6672 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0316 00:08:41.340212 6672 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0316 00:08:41.340278 6672 factory.go:656] Stopping watch factory\\\\nI0316 00:08:41.340297 6672 ovnkube.go:599] Stopped ovnkube\\\\nI0316 0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8dc5c9e05d9b292469e8d707f3727adbf33725ea339bcfefead2bbdbb094400\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-16T00:08:43Z\\\",\\\"message\\\":\\\"0:08:43.170302 6850 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0316 00:08:43.170323 6850 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0316 00:08:43.170342 6850 handler.go:208] Removed *v1.Node event handler 7\\\\nI0316 00:08:43.170346 6850 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0316 00:08:43.170361 6850 handler.go:208] Removed *v1.Node event handler 2\\\\nI0316 00:08:43.170363 6850 reflector.go:311] Stopping reflector *v1.Pod (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI0316 00:08:43.170364 6850 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0316 00:08:43.170402 6850 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0316 00:08:43.170450 6850 factory.go:656] Stopping watch factory\\\\nI0316 00:08:43.170454 6850 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0316 00:08:43.170464 6850 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0316 00:08:43.170464 6850 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0316 00:08:43.170503 6850 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0316 00:08:43.170666 6850 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni
-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d6bf581280690c99ddbbecd4cf4e215a12230d377978aa48acd3c05b85a3c56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"nam
e\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1249f8b1c01db98230c10bde49bdb68f17caccc1ba0546e462df4384ca1658dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1249f8b1c01db98230c10bde49bdb68f17caccc1ba0546e462df4384ca1658dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-psjs7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:44Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.573395 4816 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://723654bc91ce4d40a27b14a9c4df1ed6c8dff2517f378927d7a6a96aeaa6951b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:44Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.582086 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7b28986d-e33b-4876-ab6d-64d69960fb8b-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-kbgw5\" (UID: \"7b28986d-e33b-4876-ab6d-64d69960fb8b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kbgw5" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.582135 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zgr7\" (UniqueName: \"kubernetes.io/projected/7b28986d-e33b-4876-ab6d-64d69960fb8b-kube-api-access-9zgr7\") pod \"ovnkube-control-plane-749d76644c-kbgw5\" (UID: \"7b28986d-e33b-4876-ab6d-64d69960fb8b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kbgw5" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.582168 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7b28986d-e33b-4876-ab6d-64d69960fb8b-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-kbgw5\" (UID: \"7b28986d-e33b-4876-ab6d-64d69960fb8b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kbgw5" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.582197 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7b28986d-e33b-4876-ab6d-64d69960fb8b-env-overrides\") pod \"ovnkube-control-plane-749d76644c-kbgw5\" (UID: \"7b28986d-e33b-4876-ab6d-64d69960fb8b\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kbgw5" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.588195 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37869baff974556c22aadadfa4b528b5d6b9ad236107b4dbc945b3cdbf743f20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f9fe727708fdaca68596b7d305629ade45b061
a88a81085f885d490ab99e24b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:44Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.601908 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cnhkf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e686cd4-bddf-463e-b471-e49ea862691e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f6c3cacd4d7f6866e66b7234b280540ef2c443531de7a11f6894323ef24a9fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bwzt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cnhkf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:44Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.607652 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.607685 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.607696 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.607713 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.607725 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:44Z","lastTransitionTime":"2026-03-16T00:08:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.615720 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kbgw5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b28986d-e33b-4876-ab6d-64d69960fb8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zgr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zgr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kbgw5\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:44Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.638700 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c8786d1-025d-4788-bafe-c2c2eaf8e398\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46db23088e8ba791d736f70a65bd066af80a5ec27c31332d9ac9c52530b21bfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da7b257af12b9ee605dc32a26d9565f866a9cafebb4936a0bde7f42ae9909f80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c2523ea43f2490afb81354be8a2840f54a3be1a8d3154196e0c8ac365d09723a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0316 00:07:59.373922 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0316 00:07:59.374075 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0316 00:07:59.374860 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2696044728/tls.crt::/tmp/serving-cert-2696044728/tls.key\\\\\\\"\\\\nI0316 00:07:59.560928 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0316 00:07:59.563996 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0316 00:07:59.564013 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0316 00:07:59.564035 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0316 00:07:59.564041 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0316 00:07:59.571475 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0316 00:07:59.571523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571531 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571540 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0316 00:07:59.571574 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0316 00:07:59.571588 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0316 00:07:59.571493 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0316 00:07:59.571597 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0316 00:07:59.573484 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81d0c40be9c3912864280cd98481f837ef29f69d85599b39decf5e9d7a97eff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:44Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.653361 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://516852108b3d17a15676573ff080d3d94fa48d8076ed0404a8d827e715c2555e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:44Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.666781 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.666806 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.666781 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:08:44 crc kubenswrapper[4816]: E0316 00:08:44.666915 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:08:44 crc kubenswrapper[4816]: E0316 00:08:44.666984 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:08:44 crc kubenswrapper[4816]: E0316 00:08:44.667083 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.668643 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:44Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.683193 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9zgr7\" (UniqueName: 
\"kubernetes.io/projected/7b28986d-e33b-4876-ab6d-64d69960fb8b-kube-api-access-9zgr7\") pod \"ovnkube-control-plane-749d76644c-kbgw5\" (UID: \"7b28986d-e33b-4876-ab6d-64d69960fb8b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kbgw5" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.683247 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7b28986d-e33b-4876-ab6d-64d69960fb8b-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-kbgw5\" (UID: \"7b28986d-e33b-4876-ab6d-64d69960fb8b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kbgw5" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.683303 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7b28986d-e33b-4876-ab6d-64d69960fb8b-env-overrides\") pod \"ovnkube-control-plane-749d76644c-kbgw5\" (UID: \"7b28986d-e33b-4876-ab6d-64d69960fb8b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kbgw5" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.683366 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7b28986d-e33b-4876-ab6d-64d69960fb8b-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-kbgw5\" (UID: \"7b28986d-e33b-4876-ab6d-64d69960fb8b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kbgw5" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.684331 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7b28986d-e33b-4876-ab6d-64d69960fb8b-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-kbgw5\" (UID: \"7b28986d-e33b-4876-ab6d-64d69960fb8b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kbgw5" Mar 16 00:08:44 crc 
kubenswrapper[4816]: I0316 00:08:44.684347 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7b28986d-e33b-4876-ab6d-64d69960fb8b-env-overrides\") pod \"ovnkube-control-plane-749d76644c-kbgw5\" (UID: \"7b28986d-e33b-4876-ab6d-64d69960fb8b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kbgw5" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.684715 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mt7bq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"03ef49f1-0c6a-443a-8df3-2db339c562ed\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d31459f93f54cc1cd99638b1f108b8997b34f246cf8c65e5462e672f04aaa7c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8979f751ec4797986924a3efea84d02bc1c1cf303e1e992f349c9083f2fe4f08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8979f751ec4797986924a3efea84d02bc1c1cf303e1e992f349c9083f2fe4f08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa87691b0dc314209d88d1f4d5b2227abc1de527e8111831bb8243ba6192c300\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085
a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa87691b0dc314209d88d1f4d5b2227abc1de527e8111831bb8243ba6192c300\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33f82b993e3d729a5830e88136e74687c930d6ad1a15c64f003f311fda418bce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33f82b993e3d729a5830e88136e74687c930d6ad1a15c64f003f311fda418bce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:34Z\\\",\\\"reason\\\":\\
\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b640c26797535db11051d50aefd13b7b2759cfa19d163f53764b0fe8710cefdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b640c26797535db11051d50aefd13b7b2759cfa19d163f53764b0fe8710cefdc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe5359762a36b39c9f8ae939a1b9d094c42ffab59c4b9743a83b4391c171a512\\\",\\\"image\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe5359762a36b39c9f8ae939a1b9d094c42ffab59c4b9743a83b4391c171a512\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc31f057b776cf1925d265b64c7a9a9fb0993e4f3ffff530a48f28865e1c2954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc31f057b776cf1925d265b64c7a9a9fb0993e4f3ffff530a48f28865e1c2954\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\
\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mt7bq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:44Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.690399 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7b28986d-e33b-4876-ab6d-64d69960fb8b-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-kbgw5\" (UID: \"7b28986d-e33b-4876-ab6d-64d69960fb8b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kbgw5" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.699871 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lhpbn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ec6a8ee-efd9-45df-bb35-706fcc90ebe9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a34b33696ef85102800de34359b2f8b4bf11c03ca6347dde766456bcea20e0e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c2x8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lhpbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:44Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.703500 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9zgr7\" (UniqueName: \"kubernetes.io/projected/7b28986d-e33b-4876-ab6d-64d69960fb8b-kube-api-access-9zgr7\") pod \"ovnkube-control-plane-749d76644c-kbgw5\" (UID: \"7b28986d-e33b-4876-ab6d-64d69960fb8b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kbgw5" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.707112 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kbgw5" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.709908 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.710001 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.710216 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.710413 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.710616 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:44Z","lastTransitionTime":"2026-03-16T00:08:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.715696 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:44Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.737689 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:44Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.812746 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.812780 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.812791 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 
00:08:44.812807 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.812820 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:44Z","lastTransitionTime":"2026-03-16T00:08:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.915474 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.915520 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.915533 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.915590 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:44 crc kubenswrapper[4816]: I0316 00:08:44.915603 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:44Z","lastTransitionTime":"2026-03-16T00:08:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.019056 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.019120 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.019134 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.019162 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.019178 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:45Z","lastTransitionTime":"2026-03-16T00:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.122396 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.122443 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.122455 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.122475 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.122488 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:45Z","lastTransitionTime":"2026-03-16T00:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.136912 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-jqsjn"] Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.137890 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jqsjn" Mar 16 00:08:45 crc kubenswrapper[4816]: E0316 00:08:45.138051 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jqsjn" podUID="84360ef9-0450-44c5-80eb-eab1bf8e808b" Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.157332 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lhpbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ec6a8ee-efd9-45df-bb35-706fcc90ebe9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a34b33696ef85102800de34359b2f8b4bf11c03ca6347dde766456bcea20e0e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\
"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c2x8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lhpbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:45Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.185161 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c8786d1-025d-4788-bafe-c2c2eaf8e398\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46db23088e8ba791d736f70a65bd066af80a5ec27c31332d9ac9c52530b21bfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da7b257af12b9ee605dc32a26d9565f866a9cafebb4936a0bde7f42ae9909f80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c2523ea43f2490afb81354be8a2840f54a3be1a8d3154196e0c8ac365d09723a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0316 00:07:59.373922 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0316 00:07:59.374075 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0316 00:07:59.374860 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2696044728/tls.crt::/tmp/serving-cert-2696044728/tls.key\\\\\\\"\\\\nI0316 00:07:59.560928 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0316 00:07:59.563996 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0316 00:07:59.564013 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0316 00:07:59.564035 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0316 00:07:59.564041 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0316 00:07:59.571475 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0316 00:07:59.571523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571531 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571540 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0316 00:07:59.571574 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0316 00:07:59.571588 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0316 00:07:59.571493 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0316 00:07:59.571597 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0316 00:07:59.573484 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81d0c40be9c3912864280cd98481f837ef29f69d85599b39decf5e9d7a97eff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:45Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.204651 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://516852108b3d17a15676573ff080d3d94fa48d8076ed0404a8d827e715c2555e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:45Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.220175 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:45Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.224183 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.224215 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.224226 4816 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.224241 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.224254 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:45Z","lastTransitionTime":"2026-03-16T00:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.234880 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mt7bq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"03ef49f1-0c6a-443a-8df3-2db339c562ed\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d31459f93f54cc1cd996
38b1f108b8997b34f246cf8c65e5462e672f04aaa7c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8979f751ec4797986924a3efea84d02bc1c1cf303e1e992f349c9083f2fe4f08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8979f751ec4797986924a3efea84d02bc1c1cf303e1e992f349c9083f2fe4f08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa87691b0dc314209d88d1f4d5b2227abc1de527e8111831bb8243ba6192c300\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa87691b0dc314209d88d1f4d5b2227abc1de527e8111831bb8243ba6192c300\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33f82b993e3d729a5830e88136e74687c930d6ad1a15c64f003f311fda418bce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64
d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33f82b993e3d729a5830e88136e74687c930d6ad1a15c64f003f311fda418bce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b640c26797535db11051d50aefd13b7b2759cfa19d163f53764b0fe8710cefdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b640c26797535db11051d50aefd13b7b2759cfa19d163f53764b0fe8710cefdc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe5359762a36b39c9f8ae939a1b9d094c42ffab59c4b9743a83b4391c171a512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe5359762a36b39c9f8ae939a1b9d094c42ffab59c4b9743a83b4391c171a512\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc31f057b776cf1925d265b64c7a9a9fb0993e4f3ffff530a48f28865e1c2954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\
\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc31f057b776cf1925d265b64c7a9a9fb0993e4f3ffff530a48f28865e1c2954\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mt7bq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:45Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.243607 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jqsjn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84360ef9-0450-44c5-80eb-eab1bf8e808b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxldb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxldb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jqsjn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:45Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:45 crc 
kubenswrapper[4816]: I0316 00:08:45.254161 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-psjs7_2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88/ovnkube-controller/1.log" Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.256667 4816 scope.go:117] "RemoveContainer" containerID="f8dc5c9e05d9b292469e8d707f3727adbf33725ea339bcfefead2bbdbb094400" Mar 16 00:08:45 crc kubenswrapper[4816]: E0316 00:08:45.256838 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-psjs7_openshift-ovn-kubernetes(2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88)\"" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" podUID="2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.258025 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:45Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.258166 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kbgw5" event={"ID":"7b28986d-e33b-4876-ab6d-64d69960fb8b","Type":"ContainerStarted","Data":"0beb1aba8b8813abd98809200fa4dd348377427596c2935ceb012361634df2b4"} Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.258199 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kbgw5" 
event={"ID":"7b28986d-e33b-4876-ab6d-64d69960fb8b","Type":"ContainerStarted","Data":"544bab5013b513f755dd9fa88cbc71021b0cfe823aa7508af994c0b1342c3646"} Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.258211 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kbgw5" event={"ID":"7b28986d-e33b-4876-ab6d-64d69960fb8b","Type":"ContainerStarted","Data":"4c04c3194b0c2e512d22c356071f25ccde0add5c85d2b6122133e335f8944a82"} Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.276993 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:45Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.290098 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/84360ef9-0450-44c5-80eb-eab1bf8e808b-metrics-certs\") pod \"network-metrics-daemon-jqsjn\" (UID: \"84360ef9-0450-44c5-80eb-eab1bf8e808b\") " pod="openshift-multus/network-metrics-daemon-jqsjn" Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.290161 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pxldb\" (UniqueName: \"kubernetes.io/projected/84360ef9-0450-44c5-80eb-eab1bf8e808b-kube-api-access-pxldb\") pod \"network-metrics-daemon-jqsjn\" (UID: \"84360ef9-0450-44c5-80eb-eab1bf8e808b\") " pod="openshift-multus/network-metrics-daemon-jqsjn" Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.297417 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"49e35ef6-7a6d-438f-b598-4445d73db379\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbe90b49c5d42bb9d6d5d1b8f1d02b571403dd3f39d3118081351d72c4910914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://508e6dbe6f2f1392d16501e09ee2ec523a3c2a245cd96c6ab773e26e83b8bb6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b5a3223ff0fb5b96e5b87ee836add66498165759f4d6ed8cb6413f091967e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://603bdcbd4a6af5fda7ca09e4823e0db8d7ec3e3eb38eddf9f062a14b433cb094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b061f16a65a27b9a5ef66231376b70fec084479a991b7fdd430957df76da32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:45Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.309094 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-szscw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9789e58-12c8-4831-9401-af48a3e92209\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6f0b7e8f95bbbf3b90133349b8b13d2784582a5d95dae8dd159328b570ae962\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f1
3fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxf6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"ho
stIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-szscw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:45Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.319499 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd08ece2-7636-4966-973a-e96a34b70b53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6dd22d6548986bf126a8ce6b30c7c11d8c37bd0c03539a55cc28fb7debcfc26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4e
f318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trwbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7003a4592a48f87ab63e9f5637c7665040f9784a7c8f599c82ed61270ddb793c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trwbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jrdcz\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:45Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.326468 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.326516 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.326530 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.326566 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.326578 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:45Z","lastTransitionTime":"2026-03-16T00:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.347311 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f47b6c54d5d72d827208f4d8228be260d9d19b4a720b3174349288dee2916641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e0a510814d60106574e3cd3e612c031ac900caa0f24a12a62d7cd1f6bfda1ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://826495d8ab8f414169bf36159278da0a040fec699a74b4aebdc8f87ccdc48bd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa5f052d32ad8f52f33ee5baea5af98a64f05b831b86ebced6c9422fb4845fcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86c28f1ae2f2fb61b7755ac14bc7f5346036735c75daccaa0e31b4e606cfb4c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fac0a986edf984b0e34eda0f969205d5bc92f23d0edacb3c6dd8721eaed2fb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8dc5c9e05d9b292469e8d707f3727adbf33725ea339bcfefead2bbdbb094400\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2798f27d9234a85bd45aa5d5d9d6b95668aa8f3ac72933b52bb6b2bc9920651\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-16T00:08:41Z\\\",\\\"message\\\":\\\"ng *v1.Namespace event handler 1 for removal\\\\nI0316 00:08:41.339626 6672 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0316 00:08:41.339647 6672 handler.go:208] Removed *v1.NetworkPolicy event handler 
4\\\\nI0316 00:08:41.339627 6672 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0316 00:08:41.339804 6672 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0316 00:08:41.339837 6672 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0316 00:08:41.339843 6672 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0316 00:08:41.339871 6672 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0316 00:08:41.340083 6672 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0316 00:08:41.340097 6672 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0316 00:08:41.340108 6672 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0316 00:08:41.340116 6672 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0316 00:08:41.340212 6672 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0316 00:08:41.340278 6672 factory.go:656] Stopping watch factory\\\\nI0316 00:08:41.340297 6672 ovnkube.go:599] Stopped ovnkube\\\\nI0316 0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8dc5c9e05d9b292469e8d707f3727adbf33725ea339bcfefead2bbdbb094400\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-16T00:08:43Z\\\",\\\"message\\\":\\\"0:08:43.170302 6850 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0316 00:08:43.170323 6850 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0316 00:08:43.170342 6850 handler.go:208] Removed *v1.Node event handler 7\\\\nI0316 00:08:43.170346 6850 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0316 00:08:43.170361 6850 handler.go:208] Removed *v1.Node event handler 2\\\\nI0316 00:08:43.170363 6850 reflector.go:311] Stopping reflector *v1.Pod (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI0316 00:08:43.170364 6850 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0316 00:08:43.170402 6850 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0316 00:08:43.170450 6850 factory.go:656] Stopping watch factory\\\\nI0316 00:08:43.170454 6850 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0316 00:08:43.170464 6850 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0316 00:08:43.170464 6850 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0316 00:08:43.170503 6850 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0316 00:08:43.170666 6850 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni
-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d6bf581280690c99ddbbecd4cf4e215a12230d377978aa48acd3c05b85a3c56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"nam
e\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1249f8b1c01db98230c10bde49bdb68f17caccc1ba0546e462df4384ca1658dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1249f8b1c01db98230c10bde49bdb68f17caccc1ba0546e462df4384ca1658dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-psjs7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:45Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.363313 4816 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kbgw5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b28986d-e33b-4876-ab6d-64d69960fb8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zgr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zgr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kbgw5\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:45Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.378642 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://723654bc91ce4d40a27b14a9c4df1ed6c8dff2517f378927d7a6a96aeaa6951b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\
\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:45Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.390909 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pxldb\" (UniqueName: \"kubernetes.io/projected/84360ef9-0450-44c5-80eb-eab1bf8e808b-kube-api-access-pxldb\") pod \"network-metrics-daemon-jqsjn\" (UID: \"84360ef9-0450-44c5-80eb-eab1bf8e808b\") " pod="openshift-multus/network-metrics-daemon-jqsjn" Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.391070 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/84360ef9-0450-44c5-80eb-eab1bf8e808b-metrics-certs\") pod \"network-metrics-daemon-jqsjn\" (UID: \"84360ef9-0450-44c5-80eb-eab1bf8e808b\") " pod="openshift-multus/network-metrics-daemon-jqsjn" Mar 16 00:08:45 crc kubenswrapper[4816]: E0316 00:08:45.391481 4816 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 16 00:08:45 crc kubenswrapper[4816]: E0316 00:08:45.391595 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/84360ef9-0450-44c5-80eb-eab1bf8e808b-metrics-certs podName:84360ef9-0450-44c5-80eb-eab1bf8e808b nodeName:}" failed. No retries permitted until 2026-03-16 00:08:45.891542213 +0000 UTC m=+118.987842186 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/84360ef9-0450-44c5-80eb-eab1bf8e808b-metrics-certs") pod "network-metrics-daemon-jqsjn" (UID: "84360ef9-0450-44c5-80eb-eab1bf8e808b") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.392338 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37869baff974556c22aadadfa4b528b5d6b9ad236107b4dbc945b3cdbf743f20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run
/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f9fe727708fdaca68596b7d305629ade45b061a88a81085f885d490ab99e24b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:45Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.402062 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cnhkf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e686cd4-bddf-463e-b471-e49ea862691e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f6c3cacd4d7f6866e66b7234b280540ef2c443531de7a11f6894323ef24a9fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bwzt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cnhkf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:45Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.410478 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pxldb\" (UniqueName: \"kubernetes.io/projected/84360ef9-0450-44c5-80eb-eab1bf8e808b-kube-api-access-pxldb\") pod \"network-metrics-daemon-jqsjn\" (UID: \"84360ef9-0450-44c5-80eb-eab1bf8e808b\") " pod="openshift-multus/network-metrics-daemon-jqsjn" Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.413507 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:45Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.424163 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:45Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.428563 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.428585 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.428594 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.428607 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.428615 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:45Z","lastTransitionTime":"2026-03-16T00:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.435857 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jqsjn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84360ef9-0450-44c5-80eb-eab1bf8e808b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxldb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxldb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jqsjn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:45Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:45 crc 
kubenswrapper[4816]: I0316 00:08:45.454133 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49e35ef6-7a6d-438f-b598-4445d73db379\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbe90b49c5d42bb9d6d5d1b8f1d02b571403dd3f39d3118081351d72c4910914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://508e6dbe6f2f1392d16501e09ee2ec523a3c2a245cd96c6ab773e26e83b8bb6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b5a3223ff0fb5b96e5b87ee836add66498165759f4d6ed8cb6413f091967e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://603bdcbd4a6af5fda7ca09e4823e0db8d7ec3e3eb38eddf9f062a14b433cb094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b061f16a65a27b9a5ef66231376b70fec084479a991b7fdd430957df76da32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:45Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.465416 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-szscw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9789e58-12c8-4831-9401-af48a3e92209\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6f0b7e8f95bbbf3b901333
49b8b13d2784582a5d95dae8dd159328b570ae962\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/
run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxf6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-szscw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:45Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.476317 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd08ece2-7636-4966-973a-e96a34b70b53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6dd22d6548986bf126a8ce6b30c7c11d8c37bd0c03539a55cc28fb7debcfc26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trwbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7003a4592a48f87ab63e9f5637c7665040f9784a
7c8f599c82ed61270ddb793c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trwbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jrdcz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:45Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.492522 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f47b6c54d5d72d827208f4d8228be260d9d19b4a720b3174349288dee2916641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e0a510814d60106574e3cd3e612c031ac900caa0f24a12a62d7cd1f6bfda1ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://826495d8ab8f414169bf36159278da0a040fec699a74b4aebdc8f87ccdc48bd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa5f052d32ad8f52f33ee5baea5af98a64f05b831b86ebced6c9422fb4845fcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86c28f1ae2f2fb61b7755ac14bc7f5346036735c75daccaa0e31b4e606cfb4c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fac0a986edf984b0e34eda0f969205d5bc92f23d0edacb3c6dd8721eaed2fb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8dc5c9e05d9b292469e8d707f3727adbf33725ea339bcfefead2bbdbb094400\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8dc5c9e05d9b292469e8d707f3727adbf33725ea339bcfefead2bbdbb094400\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-16T00:08:43Z\\\",\\\"message\\\":\\\"0:08:43.170302 6850 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0316 00:08:43.170323 6850 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0316 00:08:43.170342 6850 handler.go:208] Removed *v1.Node event handler 7\\\\nI0316 00:08:43.170346 6850 handler.go:190] Sending *v1.Pod event handler 3 for 
removal\\\\nI0316 00:08:43.170361 6850 handler.go:208] Removed *v1.Node event handler 2\\\\nI0316 00:08:43.170363 6850 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0316 00:08:43.170364 6850 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0316 00:08:43.170402 6850 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0316 00:08:43.170450 6850 factory.go:656] Stopping watch factory\\\\nI0316 00:08:43.170454 6850 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0316 00:08:43.170464 6850 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0316 00:08:43.170464 6850 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0316 00:08:43.170503 6850 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0316 00:08:43.170666 6850 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:42Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-psjs7_openshift-ovn-kubernetes(2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d6bf581280690c99ddbbecd4cf4e215a12230d377978aa48acd3c05b85a3c56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1249f8b1c01db98230c10bde49bdb68f17caccc1ba0546e462df4384ca1658dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1249f8b1c01db98230
c10bde49bdb68f17caccc1ba0546e462df4384ca1658dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-psjs7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:45Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.510694 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://723654bc91ce4d40a27b14a9c4df1ed6c8dff2517f378927d7a6a96aeaa6951b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:45Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.525750 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37869baff974556c22aadadfa4b528b5d6b9ad236107b4dbc945b3cdbf743f20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://2f9fe727708fdaca68596b7d305629ade45b061a88a81085f885d490ab99e24b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:45Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.530851 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.530898 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.530910 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.530928 4816 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.530941 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:45Z","lastTransitionTime":"2026-03-16T00:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.541057 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cnhkf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e686cd4-bddf-463e-b471-e49ea862691e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f6c3cacd4d7f6866e66b7234b280540ef2c443531de7a11f6894323ef24a9fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",
\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bwzt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cnhkf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:45Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.553253 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kbgw5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b28986d-e33b-4876-ab6d-64d69960fb8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://544bab5013b513f755dd9fa88cbc71021b0cfe823aa7508af994c0b1342c3646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zgr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0beb1aba8b8813abd98809200fa4dd3483774
27596c2935ceb012361634df2b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zgr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kbgw5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:45Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.574433 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c8786d1-025d-4788-bafe-c2c2eaf8e398\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46db23088e8ba791d736f70a65bd066af80a5ec27c31332d9ac9c52530b21bfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da7b257af12b9ee605dc32a26d9565f866a9cafebb4936a0bde7f42ae9909f80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2523ea43f2490afb81354be8a2840f54a3be1a8d3154196e0c8ac365d09723a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0316 00:07:59.373922 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0316 00:07:59.374075 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0316 00:07:59.374860 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2696044728/tls.crt::/tmp/serving-cert-2696044728/tls.key\\\\\\\"\\\\nI0316 00:07:59.560928 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0316 00:07:59.563996 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0316 00:07:59.564013 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0316 00:07:59.564035 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0316 00:07:59.564041 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0316 00:07:59.571475 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0316 00:07:59.571523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571531 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571540 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0316 00:07:59.571574 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0316 00:07:59.571588 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0316 00:07:59.571493 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0316 00:07:59.571597 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0316 00:07:59.573484 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81d0c40be9c3912864280cd98481f837ef29f69d85599b39decf5e9d7a97eff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:45Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.587942 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://516852108b3d17a15676573ff080d3d94fa48d8076ed0404a8d827e715c2555e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-16T00:08:45Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.601171 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:45Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.618051 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mt7bq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"03ef49f1-0c6a-443a-8df3-2db339c562ed\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d31459f93f54cc1cd99638b1f108b8997b34f246cf8c65e5462e672f04aaa7c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8979f751ec4797986924a3efea84d02bc1c1cf303e1e992f349c9083f2fe4f08\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8979f751ec4797986924a3efea84d02bc1c1cf303e1e992f349c9083f2fe4f08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa87691b0dc314209d88d1f4d5b2227abc1de527e8111831bb8243ba6192c300\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa87691b0dc314209d88d1f4d5b2227abc1de527e8111831bb8243ba6192c300\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33f82b993e3d729a5830e88136e74687c930d6ad1a15c64f003f311fda418bce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33f82b993e3d729a5830e88136e74687c930d6ad1a15c64f003f311fda418bce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b640c
26797535db11051d50aefd13b7b2759cfa19d163f53764b0fe8710cefdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b640c26797535db11051d50aefd13b7b2759cfa19d163f53764b0fe8710cefdc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe5359762a36b39c9f8ae939a1b9d094c42ffab59c4b9743a83b4391c171a512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe5359762a36b39c9f8ae939a1b9d094c42ffab59c4b9743a83b4391c171a512\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:36Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc31f057b776cf1925d265b64c7a9a9fb0993e4f3ffff530a48f28865e1c2954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc31f057b776cf1925d265b64c7a9a9fb0993e4f3ffff530a48f28865e1c2954\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mt7bq\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:45Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.629160 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lhpbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ec6a8ee-efd9-45df-bb35-706fcc90ebe9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a34b33696ef85102800de34359b2f8b4bf11c03ca6347dde766456bcea20e0e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-03-16T00:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c2x8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lhpbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:45Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.632924 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.632973 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.632987 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.633010 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.633025 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:45Z","lastTransitionTime":"2026-03-16T00:08:45Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.678272 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.735694 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.735755 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.735772 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.735798 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.735817 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:45Z","lastTransitionTime":"2026-03-16T00:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.839008 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.839061 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.839077 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.839101 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.839118 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:45Z","lastTransitionTime":"2026-03-16T00:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.895414 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/84360ef9-0450-44c5-80eb-eab1bf8e808b-metrics-certs\") pod \"network-metrics-daemon-jqsjn\" (UID: \"84360ef9-0450-44c5-80eb-eab1bf8e808b\") " pod="openshift-multus/network-metrics-daemon-jqsjn" Mar 16 00:08:45 crc kubenswrapper[4816]: E0316 00:08:45.895696 4816 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 16 00:08:45 crc kubenswrapper[4816]: E0316 00:08:45.895795 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/84360ef9-0450-44c5-80eb-eab1bf8e808b-metrics-certs podName:84360ef9-0450-44c5-80eb-eab1bf8e808b nodeName:}" failed. No retries permitted until 2026-03-16 00:08:46.895765188 +0000 UTC m=+119.992065171 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/84360ef9-0450-44c5-80eb-eab1bf8e808b-metrics-certs") pod "network-metrics-daemon-jqsjn" (UID: "84360ef9-0450-44c5-80eb-eab1bf8e808b") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.942681 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.942741 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.942751 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.942773 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:45 crc kubenswrapper[4816]: I0316 00:08:45.942787 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:45Z","lastTransitionTime":"2026-03-16T00:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:46 crc kubenswrapper[4816]: I0316 00:08:46.045138 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:46 crc kubenswrapper[4816]: I0316 00:08:46.045205 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:46 crc kubenswrapper[4816]: I0316 00:08:46.045222 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:46 crc kubenswrapper[4816]: I0316 00:08:46.045242 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:46 crc kubenswrapper[4816]: I0316 00:08:46.045254 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:46Z","lastTransitionTime":"2026-03-16T00:08:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:46 crc kubenswrapper[4816]: I0316 00:08:46.148625 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:46 crc kubenswrapper[4816]: I0316 00:08:46.148683 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:46 crc kubenswrapper[4816]: I0316 00:08:46.148694 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:46 crc kubenswrapper[4816]: I0316 00:08:46.148719 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:46 crc kubenswrapper[4816]: I0316 00:08:46.148732 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:46Z","lastTransitionTime":"2026-03-16T00:08:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:46 crc kubenswrapper[4816]: I0316 00:08:46.251711 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:46 crc kubenswrapper[4816]: I0316 00:08:46.251769 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:46 crc kubenswrapper[4816]: I0316 00:08:46.251782 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:46 crc kubenswrapper[4816]: I0316 00:08:46.251803 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:46 crc kubenswrapper[4816]: I0316 00:08:46.251820 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:46Z","lastTransitionTime":"2026-03-16T00:08:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:46 crc kubenswrapper[4816]: I0316 00:08:46.354895 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:46 crc kubenswrapper[4816]: I0316 00:08:46.354942 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:46 crc kubenswrapper[4816]: I0316 00:08:46.354956 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:46 crc kubenswrapper[4816]: I0316 00:08:46.354977 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:46 crc kubenswrapper[4816]: I0316 00:08:46.354992 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:46Z","lastTransitionTime":"2026-03-16T00:08:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:46 crc kubenswrapper[4816]: I0316 00:08:46.458035 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:46 crc kubenswrapper[4816]: I0316 00:08:46.458352 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:46 crc kubenswrapper[4816]: I0316 00:08:46.458383 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:46 crc kubenswrapper[4816]: I0316 00:08:46.458412 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:46 crc kubenswrapper[4816]: I0316 00:08:46.458433 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:46Z","lastTransitionTime":"2026-03-16T00:08:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:46 crc kubenswrapper[4816]: I0316 00:08:46.561420 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:46 crc kubenswrapper[4816]: I0316 00:08:46.561472 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:46 crc kubenswrapper[4816]: I0316 00:08:46.561490 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:46 crc kubenswrapper[4816]: I0316 00:08:46.561512 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:46 crc kubenswrapper[4816]: I0316 00:08:46.561532 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:46Z","lastTransitionTime":"2026-03-16T00:08:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:46 crc kubenswrapper[4816]: I0316 00:08:46.664137 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:46 crc kubenswrapper[4816]: I0316 00:08:46.664191 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:46 crc kubenswrapper[4816]: I0316 00:08:46.664209 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:46 crc kubenswrapper[4816]: I0316 00:08:46.664234 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:46 crc kubenswrapper[4816]: I0316 00:08:46.664251 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:46Z","lastTransitionTime":"2026-03-16T00:08:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:46 crc kubenswrapper[4816]: I0316 00:08:46.666772 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:08:46 crc kubenswrapper[4816]: I0316 00:08:46.666874 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:08:46 crc kubenswrapper[4816]: I0316 00:08:46.666915 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jqsjn" Mar 16 00:08:46 crc kubenswrapper[4816]: E0316 00:08:46.667106 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:08:46 crc kubenswrapper[4816]: I0316 00:08:46.667211 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:08:46 crc kubenswrapper[4816]: E0316 00:08:46.667246 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:08:46 crc kubenswrapper[4816]: E0316 00:08:46.667424 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:08:46 crc kubenswrapper[4816]: E0316 00:08:46.667711 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jqsjn" podUID="84360ef9-0450-44c5-80eb-eab1bf8e808b" Mar 16 00:08:46 crc kubenswrapper[4816]: I0316 00:08:46.766920 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:46 crc kubenswrapper[4816]: I0316 00:08:46.766978 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:46 crc kubenswrapper[4816]: I0316 00:08:46.766997 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:46 crc kubenswrapper[4816]: I0316 00:08:46.767034 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:46 crc kubenswrapper[4816]: I0316 00:08:46.767052 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:46Z","lastTransitionTime":"2026-03-16T00:08:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:46 crc kubenswrapper[4816]: I0316 00:08:46.869955 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:46 crc kubenswrapper[4816]: I0316 00:08:46.869996 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:46 crc kubenswrapper[4816]: I0316 00:08:46.870004 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:46 crc kubenswrapper[4816]: I0316 00:08:46.870019 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:46 crc kubenswrapper[4816]: I0316 00:08:46.870031 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:46Z","lastTransitionTime":"2026-03-16T00:08:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:46 crc kubenswrapper[4816]: I0316 00:08:46.905027 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/84360ef9-0450-44c5-80eb-eab1bf8e808b-metrics-certs\") pod \"network-metrics-daemon-jqsjn\" (UID: \"84360ef9-0450-44c5-80eb-eab1bf8e808b\") " pod="openshift-multus/network-metrics-daemon-jqsjn" Mar 16 00:08:46 crc kubenswrapper[4816]: E0316 00:08:46.905171 4816 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 16 00:08:46 crc kubenswrapper[4816]: E0316 00:08:46.905239 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/84360ef9-0450-44c5-80eb-eab1bf8e808b-metrics-certs podName:84360ef9-0450-44c5-80eb-eab1bf8e808b nodeName:}" failed. No retries permitted until 2026-03-16 00:08:48.905223279 +0000 UTC m=+122.001523242 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/84360ef9-0450-44c5-80eb-eab1bf8e808b-metrics-certs") pod "network-metrics-daemon-jqsjn" (UID: "84360ef9-0450-44c5-80eb-eab1bf8e808b") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 16 00:08:46 crc kubenswrapper[4816]: I0316 00:08:46.974036 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:46 crc kubenswrapper[4816]: I0316 00:08:46.974097 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:46 crc kubenswrapper[4816]: I0316 00:08:46.974114 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:46 crc kubenswrapper[4816]: I0316 00:08:46.974140 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:46 crc kubenswrapper[4816]: I0316 00:08:46.974159 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:46Z","lastTransitionTime":"2026-03-16T00:08:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:47 crc kubenswrapper[4816]: I0316 00:08:47.077219 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:47 crc kubenswrapper[4816]: I0316 00:08:47.077274 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:47 crc kubenswrapper[4816]: I0316 00:08:47.077285 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:47 crc kubenswrapper[4816]: I0316 00:08:47.077300 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:47 crc kubenswrapper[4816]: I0316 00:08:47.077309 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:47Z","lastTransitionTime":"2026-03-16T00:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:47 crc kubenswrapper[4816]: I0316 00:08:47.180368 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:47 crc kubenswrapper[4816]: I0316 00:08:47.180430 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:47 crc kubenswrapper[4816]: I0316 00:08:47.180463 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:47 crc kubenswrapper[4816]: I0316 00:08:47.180492 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:47 crc kubenswrapper[4816]: I0316 00:08:47.180512 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:47Z","lastTransitionTime":"2026-03-16T00:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:47 crc kubenswrapper[4816]: I0316 00:08:47.283736 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:47 crc kubenswrapper[4816]: I0316 00:08:47.283810 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:47 crc kubenswrapper[4816]: I0316 00:08:47.283834 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:47 crc kubenswrapper[4816]: I0316 00:08:47.283863 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:47 crc kubenswrapper[4816]: I0316 00:08:47.283885 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:47Z","lastTransitionTime":"2026-03-16T00:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:47 crc kubenswrapper[4816]: I0316 00:08:47.387078 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:47 crc kubenswrapper[4816]: I0316 00:08:47.387136 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:47 crc kubenswrapper[4816]: I0316 00:08:47.387152 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:47 crc kubenswrapper[4816]: I0316 00:08:47.387174 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:47 crc kubenswrapper[4816]: I0316 00:08:47.387191 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:47Z","lastTransitionTime":"2026-03-16T00:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:47 crc kubenswrapper[4816]: I0316 00:08:47.489984 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:47 crc kubenswrapper[4816]: I0316 00:08:47.490050 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:47 crc kubenswrapper[4816]: I0316 00:08:47.490076 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:47 crc kubenswrapper[4816]: I0316 00:08:47.490103 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:47 crc kubenswrapper[4816]: I0316 00:08:47.490120 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:47Z","lastTransitionTime":"2026-03-16T00:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:47 crc kubenswrapper[4816]: E0316 00:08:47.590894 4816 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Mar 16 00:08:47 crc kubenswrapper[4816]: I0316 00:08:47.689896 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c8786d1-025d-4788-bafe-c2c2eaf8e398\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46db23088e8ba791d736f70a65bd066af80a5ec27c31332d9ac9c52530b21bfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da7b257af12b9ee605dc32a26d9565f866a9cafebb4936a0bde7f42ae9909f80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c2523ea43f2490afb81354be8a2840f54a3be1a8d3154196e0c8ac365d09723a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0316 00:07:59.373922 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0316 00:07:59.374075 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0316 00:07:59.374860 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2696044728/tls.crt::/tmp/serving-cert-2696044728/tls.key\\\\\\\"\\\\nI0316 00:07:59.560928 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0316 00:07:59.563996 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0316 00:07:59.564013 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0316 00:07:59.564035 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0316 00:07:59.564041 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0316 00:07:59.571475 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0316 00:07:59.571523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571531 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571540 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0316 00:07:59.571574 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0316 00:07:59.571588 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0316 00:07:59.571493 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0316 00:07:59.571597 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0316 00:07:59.573484 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81d0c40be9c3912864280cd98481f837ef29f69d85599b39decf5e9d7a97eff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:47Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:47 crc kubenswrapper[4816]: I0316 00:08:47.709836 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://516852108b3d17a15676573ff080d3d94fa48d8076ed0404a8d827e715c2555e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:47Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:47 crc kubenswrapper[4816]: I0316 00:08:47.747594 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:47Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:47 crc kubenswrapper[4816]: I0316 00:08:47.772056 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mt7bq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"03ef49f1-0c6a-443a-8df3-2db339c562ed\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d31459f93f54cc1cd99638b1f108b8997b34f246cf8c65e5462e672f04aaa7c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8979f751ec4797986924a3efea84d02bc1c1cf303e1e992f349c9083f2fe4f08\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8979f751ec4797986924a3efea84d02bc1c1cf303e1e992f349c9083f2fe4f08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa87691b0dc314209d88d1f4d5b2227abc1de527e8111831bb8243ba6192c300\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa87691b0dc314209d88d1f4d5b2227abc1de527e8111831bb8243ba6192c300\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33f82b993e3d729a5830e88136e74687c930d6ad1a15c64f003f311fda418bce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33f82b993e3d729a5830e88136e74687c930d6ad1a15c64f003f311fda418bce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b640c
26797535db11051d50aefd13b7b2759cfa19d163f53764b0fe8710cefdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b640c26797535db11051d50aefd13b7b2759cfa19d163f53764b0fe8710cefdc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe5359762a36b39c9f8ae939a1b9d094c42ffab59c4b9743a83b4391c171a512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe5359762a36b39c9f8ae939a1b9d094c42ffab59c4b9743a83b4391c171a512\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:36Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc31f057b776cf1925d265b64c7a9a9fb0993e4f3ffff530a48f28865e1c2954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc31f057b776cf1925d265b64c7a9a9fb0993e4f3ffff530a48f28865e1c2954\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mt7bq\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:47Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:47 crc kubenswrapper[4816]: E0316 00:08:47.784596 4816 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 16 00:08:47 crc kubenswrapper[4816]: I0316 00:08:47.798270 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lhpbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ec6a8ee-efd9-45df-bb35-706fcc90ebe9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a34b33696ef85102800de34359b2f8b4bf11c03ca6347dde766456bcea20e0e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63
f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c2x8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lhpbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:47Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:47 crc kubenswrapper[4816]: I0316 00:08:47.819192 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3a9d51fa-9450-4f18-be16-0a93d64889b4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94c8e0e7b0ecd4d0f525f55538bff0b01653544be87c923fc152d4d982da7021\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02c81b3d94044c1f4f344dd8306cacf8db533a662535ddb0a6a4ffc1bb4cf1f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37dea82cd38ef9cfbda3626b789b21af2ada4b4a58bc2ecc6ce55eda60052277\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e44b0bbe4cebe4e52726f3f654f8b4a2953abcd0ad32db752c4641b3f0be1b00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://e44b0bbe4cebe4e52726f3f654f8b4a2953abcd0ad32db752c4641b3f0be1b00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:47Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:47 crc kubenswrapper[4816]: I0316 00:08:47.840451 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:47Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:47 crc kubenswrapper[4816]: I0316 00:08:47.856614 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:47Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:47 crc kubenswrapper[4816]: I0316 00:08:47.872631 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jqsjn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84360ef9-0450-44c5-80eb-eab1bf8e808b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxldb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxldb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jqsjn\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:47Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:47 crc kubenswrapper[4816]: I0316 00:08:47.897087 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49e35ef6-7a6d-438f-b598-4445d73db379\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbe90b49c5d42bb9d6d5d1b8f1d02b571403dd3f39d3118081351d72c4910914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\
":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://508e6dbe6f2f1392d16501e09ee2ec523a3c2a245cd96c6ab773e26e83b8bb6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b5a3223ff0fb5b96e5b87ee836add66498165759f4d6ed8cb6413f091967e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://603bdcbd4a6af5fda7ca09e4823e0db8d
7ec3e3eb38eddf9f062a14b433cb094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b061f16a65a27b9a5ef66231376b70fec084479a991b7fdd430957df76da32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e
49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminate
d\\\":{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:47Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:47 crc kubenswrapper[4816]: I0316 00:08:47.914322 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-szscw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9789e58-12c8-4831-9401-af48a3e92209\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6f0b7e8f95bbbf3b90133349b8b13d2784582a5d95dae8dd159328b570ae962\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxf6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-szscw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:47Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:47 crc kubenswrapper[4816]: I0316 00:08:47.933817 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd08ece2-7636-4966-973a-e96a34b70b53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6dd22d6548986bf126a8ce6b30c7c11d8c37bd0c03539a55cc28fb7debcfc26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trwbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7003a4592a48
f87ab63e9f5637c7665040f9784a7c8f599c82ed61270ddb793c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trwbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jrdcz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:47Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:47 crc kubenswrapper[4816]: I0316 00:08:47.953141 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f47b6c54d5d72d827208f4d8228be260d9d19b4a720b3174349288dee2916641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e0a510814d60106574e3cd3e612c031ac900caa0f24a12a62d7cd1f6bfda1ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://826495d8ab8f414169bf36159278da0a040fec699a74b4aebdc8f87ccdc48bd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa5f052d32ad8f52f33ee5baea5af98a64f05b831b86ebced6c9422fb4845fcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86c28f1ae2f2fb61b7755ac14bc7f5346036735c75daccaa0e31b4e606cfb4c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fac0a986edf984b0e34eda0f969205d5bc92f23d0edacb3c6dd8721eaed2fb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8dc5c9e05d9b292469e8d707f3727adbf33725ea339bcfefead2bbdbb094400\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8dc5c9e05d9b292469e8d707f3727adbf33725ea339bcfefead2bbdbb094400\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-16T00:08:43Z\\\",\\\"message\\\":\\\"0:08:43.170302 6850 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0316 00:08:43.170323 6850 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0316 00:08:43.170342 6850 handler.go:208] Removed *v1.Node event handler 7\\\\nI0316 00:08:43.170346 6850 handler.go:190] Sending *v1.Pod event handler 3 for 
removal\\\\nI0316 00:08:43.170361 6850 handler.go:208] Removed *v1.Node event handler 2\\\\nI0316 00:08:43.170363 6850 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0316 00:08:43.170364 6850 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0316 00:08:43.170402 6850 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0316 00:08:43.170450 6850 factory.go:656] Stopping watch factory\\\\nI0316 00:08:43.170454 6850 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0316 00:08:43.170464 6850 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0316 00:08:43.170464 6850 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0316 00:08:43.170503 6850 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0316 00:08:43.170666 6850 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:42Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-psjs7_openshift-ovn-kubernetes(2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d6bf581280690c99ddbbecd4cf4e215a12230d377978aa48acd3c05b85a3c56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1249f8b1c01db98230c10bde49bdb68f17caccc1ba0546e462df4384ca1658dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1249f8b1c01db98230
c10bde49bdb68f17caccc1ba0546e462df4384ca1658dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-psjs7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:47Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:47 crc kubenswrapper[4816]: I0316 00:08:47.970525 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://723654bc91ce4d40a27b14a9c4df1ed6c8dff2517f378927d7a6a96aeaa6951b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:47Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:47 crc kubenswrapper[4816]: I0316 00:08:47.986382 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37869baff974556c22aadadfa4b528b5d6b9ad236107b4dbc945b3cdbf743f20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://2f9fe727708fdaca68596b7d305629ade45b061a88a81085f885d490ab99e24b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:47Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:47 crc kubenswrapper[4816]: I0316 00:08:47.995302 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cnhkf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e686cd4-bddf-463e-b471-e49ea862691e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f6c3cacd4d7f6866e66b7234b280540ef2c443531de7a11f6894323ef24a9fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bwzt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cnhkf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:47Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:48 crc kubenswrapper[4816]: I0316 00:08:48.006988 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kbgw5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b28986d-e33b-4876-ab6d-64d69960fb8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://544bab5013b513f755dd9fa88cbc71021b0cfe823aa7508af994c0b1342c3646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7
93426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zgr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0beb1aba8b8813abd98809200fa4dd348377427596c2935ceb012361634df2b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zgr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kbgw5\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:48Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:48 crc kubenswrapper[4816]: I0316 00:08:48.667300 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:08:48 crc kubenswrapper[4816]: I0316 00:08:48.667359 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jqsjn" Mar 16 00:08:48 crc kubenswrapper[4816]: E0316 00:08:48.667760 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jqsjn" podUID="84360ef9-0450-44c5-80eb-eab1bf8e808b" Mar 16 00:08:48 crc kubenswrapper[4816]: E0316 00:08:48.667914 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:08:48 crc kubenswrapper[4816]: I0316 00:08:48.668005 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:08:48 crc kubenswrapper[4816]: I0316 00:08:48.668061 4816 scope.go:117] "RemoveContainer" containerID="29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f" Mar 16 00:08:48 crc kubenswrapper[4816]: I0316 00:08:48.668133 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:08:48 crc kubenswrapper[4816]: E0316 00:08:48.668227 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:08:48 crc kubenswrapper[4816]: E0316 00:08:48.668308 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:08:48 crc kubenswrapper[4816]: I0316 00:08:48.928440 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/84360ef9-0450-44c5-80eb-eab1bf8e808b-metrics-certs\") pod \"network-metrics-daemon-jqsjn\" (UID: \"84360ef9-0450-44c5-80eb-eab1bf8e808b\") " pod="openshift-multus/network-metrics-daemon-jqsjn" Mar 16 00:08:48 crc kubenswrapper[4816]: E0316 00:08:48.928605 4816 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 16 00:08:48 crc kubenswrapper[4816]: E0316 00:08:48.928667 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/84360ef9-0450-44c5-80eb-eab1bf8e808b-metrics-certs podName:84360ef9-0450-44c5-80eb-eab1bf8e808b nodeName:}" failed. No retries permitted until 2026-03-16 00:08:52.928648623 +0000 UTC m=+126.024948576 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/84360ef9-0450-44c5-80eb-eab1bf8e808b-metrics-certs") pod "network-metrics-daemon-jqsjn" (UID: "84360ef9-0450-44c5-80eb-eab1bf8e808b") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 16 00:08:49 crc kubenswrapper[4816]: I0316 00:08:49.240116 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:49 crc kubenswrapper[4816]: I0316 00:08:49.240230 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:49 crc kubenswrapper[4816]: I0316 00:08:49.240253 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:49 crc kubenswrapper[4816]: I0316 00:08:49.240277 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:49 crc kubenswrapper[4816]: I0316 00:08:49.240294 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:49Z","lastTransitionTime":"2026-03-16T00:08:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:49 crc kubenswrapper[4816]: E0316 00:08:49.262302 4816 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d8fb348c-4907-4c8f-859a-735976530e03\\\",\\\"systemUUID\\\":\\\"e97abf98-298f-4589-a70a-4cfb5cb2994a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:49Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:49 crc kubenswrapper[4816]: I0316 00:08:49.267100 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:49 crc kubenswrapper[4816]: I0316 00:08:49.267168 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:49 crc kubenswrapper[4816]: I0316 00:08:49.267190 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:49 crc kubenswrapper[4816]: I0316 00:08:49.267217 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:49 crc kubenswrapper[4816]: I0316 00:08:49.267236 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:49Z","lastTransitionTime":"2026-03-16T00:08:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:49 crc kubenswrapper[4816]: I0316 00:08:49.276447 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 16 00:08:49 crc kubenswrapper[4816]: I0316 00:08:49.279447 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"4e7f3b9b1fc1648908bbf44f9ad32e1eb510b68336a8b60c8417df2f622a36e5"} Mar 16 00:08:49 crc kubenswrapper[4816]: I0316 00:08:49.279970 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 16 00:08:49 crc kubenswrapper[4816]: I0316 00:08:49.303999 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3a9d51fa-9450-4f18-be16-0a93d64889b4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94c8e0e7b0ecd4d0f525f55538bff0b01653544be87c923fc152d4d982da7021\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02c81b3d94044c1f4f344dd8306cacf8db533a662535ddb0a6a4ffc1bb4cf1f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37dea82cd38ef9cfbda3626b789b21af2ada4b4a58bc2ecc6ce55eda60052277\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"s
tarted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e44b0bbe4cebe4e52726f3f654f8b4a2953abcd0ad32db752c4641b3f0be1b00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e44b0bbe4cebe4e52726f3f654f8b4a2953abcd0ad32db752c4641b3f0be1b00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:49Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:49 crc kubenswrapper[4816]: E0316 00:08:49.306924 4816 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d8fb348c-4907-4c8f-859a-735976530e03\\\",\\\"systemUUID\\\":\\\"e97abf98-298f-4589-a70a-4cfb5cb2994a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:49Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:49 crc kubenswrapper[4816]: I0316 00:08:49.312915 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:49 crc kubenswrapper[4816]: I0316 00:08:49.312988 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:49 crc kubenswrapper[4816]: I0316 00:08:49.313007 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:49 crc kubenswrapper[4816]: I0316 00:08:49.313034 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:49 crc kubenswrapper[4816]: I0316 00:08:49.313053 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:49Z","lastTransitionTime":"2026-03-16T00:08:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:49 crc kubenswrapper[4816]: I0316 00:08:49.321667 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:49Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:49 crc kubenswrapper[4816]: E0316 00:08:49.334520 4816 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d8fb348c-4907-4c8f-859a-735976530e03\\\",\\\"systemUUID\\\":\\\"e97abf98-298f-4589-a70a-4cfb5cb2994a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:49Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:49 crc kubenswrapper[4816]: I0316 00:08:49.339237 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:49 crc kubenswrapper[4816]: I0316 00:08:49.339280 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:49 crc kubenswrapper[4816]: I0316 00:08:49.339297 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:49 crc kubenswrapper[4816]: I0316 00:08:49.339320 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:49 crc kubenswrapper[4816]: I0316 00:08:49.339339 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:49Z","lastTransitionTime":"2026-03-16T00:08:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:49 crc kubenswrapper[4816]: I0316 00:08:49.343687 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:49Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:49 crc kubenswrapper[4816]: E0316 00:08:49.354888 4816 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:49Z\\\",\\\"message\\\":\\\"kubelet 
has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800
f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\
":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256
:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc300
5909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d8fb348c-4907-4c8f-859a-735976530e03\\\",\\\"systemUUID\\\":\\\"e97abf98-298f-4589-a70a-4cfb5cb2994a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:49Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:49 crc kubenswrapper[4816]: I0316 00:08:49.359448 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:49 crc kubenswrapper[4816]: I0316 00:08:49.359488 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:49 crc kubenswrapper[4816]: I0316 00:08:49.359504 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:49 crc kubenswrapper[4816]: I0316 00:08:49.359526 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:49 crc kubenswrapper[4816]: I0316 00:08:49.359542 4816 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:49Z","lastTransitionTime":"2026-03-16T00:08:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:49 crc kubenswrapper[4816]: I0316 00:08:49.359611 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jqsjn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84360ef9-0450-44c5-80eb-eab1bf8e808b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxldb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxldb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jqsjn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:49Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:49 crc 
kubenswrapper[4816]: E0316 00:08:49.379200 4816 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d8fb348c-4907-4c8f-859a-735976530e03\\\",\\\"systemUUID\\\":\\\"e97abf98-298f-4589-a70a-4cfb5cb2994a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:49Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:49 crc kubenswrapper[4816]: E0316 00:08:49.379451 4816 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 16 00:08:49 crc kubenswrapper[4816]: I0316 00:08:49.388190 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49e35ef6-7a6d-438f-b598-4445d73db379\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbe90b49c5d42bb9d6d5d1b8f1d02b571403dd3f39d3118081351d72c4910914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://508e6dbe6f2f1392d16501e09ee2ec523a3c2a245cd96c6ab773e26e83b8bb6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b5a3223ff0fb5b96e5b87ee836add66498165759f4d6ed8cb6413f091967e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[
{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://603bdcbd4a6af5fda7ca09e4823e0db8d7ec3e3eb38eddf9f062a14b433cb094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b061f16a65a27b9a5ef66231376b70fec084479a991b7fdd430957df76da32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainer
Statuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev
@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:49Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:49 crc kubenswrapper[4816]: I0316 00:08:49.401587 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-szscw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9789e58-12c8-4831-9401-af48a3e92209\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6f0b7e8f95bbbf3b90133349b8b13d2784582a5d95dae8dd159328b570ae962\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxf6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-szscw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:49Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:49 crc kubenswrapper[4816]: I0316 00:08:49.417945 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd08ece2-7636-4966-973a-e96a34b70b53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6dd22d6548986bf126a8ce6b30c7c11d8c37bd0c03539a55cc28fb7debcfc26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trwbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7003a4592a48
f87ab63e9f5637c7665040f9784a7c8f599c82ed61270ddb793c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trwbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jrdcz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:49Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:49 crc kubenswrapper[4816]: I0316 00:08:49.440612 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f47b6c54d5d72d827208f4d8228be260d9d19b4a720b3174349288dee2916641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e0a510814d60106574e3cd3e612c031ac900caa0f24a12a62d7cd1f6bfda1ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://826495d8ab8f414169bf36159278da0a040fec699a74b4aebdc8f87ccdc48bd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa5f052d32ad8f52f33ee5baea5af98a64f05b831b86ebced6c9422fb4845fcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86c28f1ae2f2fb61b7755ac14bc7f5346036735c75daccaa0e31b4e606cfb4c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fac0a986edf984b0e34eda0f969205d5bc92f23d0edacb3c6dd8721eaed2fb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8dc5c9e05d9b292469e8d707f3727adbf33725ea339bcfefead2bbdbb094400\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8dc5c9e05d9b292469e8d707f3727adbf33725ea339bcfefead2bbdbb094400\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-16T00:08:43Z\\\",\\\"message\\\":\\\"0:08:43.170302 6850 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0316 00:08:43.170323 6850 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0316 00:08:43.170342 6850 handler.go:208] Removed *v1.Node event handler 7\\\\nI0316 00:08:43.170346 6850 handler.go:190] Sending *v1.Pod event handler 3 for 
removal\\\\nI0316 00:08:43.170361 6850 handler.go:208] Removed *v1.Node event handler 2\\\\nI0316 00:08:43.170363 6850 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0316 00:08:43.170364 6850 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0316 00:08:43.170402 6850 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0316 00:08:43.170450 6850 factory.go:656] Stopping watch factory\\\\nI0316 00:08:43.170454 6850 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0316 00:08:43.170464 6850 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0316 00:08:43.170464 6850 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0316 00:08:43.170503 6850 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0316 00:08:43.170666 6850 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:42Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-psjs7_openshift-ovn-kubernetes(2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d6bf581280690c99ddbbecd4cf4e215a12230d377978aa48acd3c05b85a3c56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1249f8b1c01db98230c10bde49bdb68f17caccc1ba0546e462df4384ca1658dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1249f8b1c01db98230
c10bde49bdb68f17caccc1ba0546e462df4384ca1658dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-psjs7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:49Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:49 crc kubenswrapper[4816]: I0316 00:08:49.459294 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://723654bc91ce4d40a27b14a9c4df1ed6c8dff2517f378927d7a6a96aeaa6951b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:49Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:49 crc kubenswrapper[4816]: I0316 00:08:49.472281 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37869baff974556c22aadadfa4b528b5d6b9ad236107b4dbc945b3cdbf743f20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://2f9fe727708fdaca68596b7d305629ade45b061a88a81085f885d490ab99e24b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:49Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:49 crc kubenswrapper[4816]: I0316 00:08:49.481770 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cnhkf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e686cd4-bddf-463e-b471-e49ea862691e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f6c3cacd4d7f6866e66b7234b280540ef2c443531de7a11f6894323ef24a9fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bwzt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cnhkf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:49Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:49 crc kubenswrapper[4816]: I0316 00:08:49.495770 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kbgw5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b28986d-e33b-4876-ab6d-64d69960fb8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://544bab5013b513f755dd9fa88cbc71021b0cfe823aa7508af994c0b1342c3646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7
93426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zgr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0beb1aba8b8813abd98809200fa4dd348377427596c2935ceb012361634df2b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zgr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kbgw5\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:49Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:49 crc kubenswrapper[4816]: I0316 00:08:49.514487 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c8786d1-025d-4788-bafe-c2c2eaf8e398\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46db23088e8ba791d736f70a65bd066af80a5ec27c31332d9ac9c52530b21bfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da7b257af12b9ee605dc32a26d9565f866a9cafebb4936a0bde7f42ae9909f80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c2523ea43f2490afb81354be8a2840f54a3be1a8d3154196e0c8ac365d09723a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e7f3b9b1fc1648908bbf44f9ad32e1eb510b68336a8b60c8417df2f622a36e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0316 00:07:59.373922 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0316 00:07:59.374075 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0316 00:07:59.374860 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2696044728/tls.crt::/tmp/serving-cert-2696044728/tls.key\\\\\\\"\\\\nI0316 00:07:59.560928 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0316 00:07:59.563996 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0316 00:07:59.564013 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0316 00:07:59.564035 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0316 00:07:59.564041 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0316 00:07:59.571475 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0316 00:07:59.571523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571531 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571540 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0316 00:07:59.571574 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0316 00:07:59.571588 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0316 00:07:59.571493 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0316 00:07:59.571597 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0316 00:07:59.573484 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81d0c40be9c3912864280cd98481f837ef29f69d85599b39decf5e9d7a97eff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:49Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:49 crc kubenswrapper[4816]: I0316 00:08:49.527669 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://516852108b3d17a15676573ff080d3d94fa48d8076ed0404a8d827e715c2555e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:49Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:49 crc kubenswrapper[4816]: I0316 00:08:49.541918 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:49Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:49 crc kubenswrapper[4816]: I0316 00:08:49.559493 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mt7bq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"03ef49f1-0c6a-443a-8df3-2db339c562ed\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d31459f93f54cc1cd99638b1f108b8997b34f246cf8c65e5462e672f04aaa7c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8979f751ec4797986924a3efea84d02bc1c1cf303e1e992f349c9083f2fe4f08\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8979f751ec4797986924a3efea84d02bc1c1cf303e1e992f349c9083f2fe4f08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa87691b0dc314209d88d1f4d5b2227abc1de527e8111831bb8243ba6192c300\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa87691b0dc314209d88d1f4d5b2227abc1de527e8111831bb8243ba6192c300\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33f82b993e3d729a5830e88136e74687c930d6ad1a15c64f003f311fda418bce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33f82b993e3d729a5830e88136e74687c930d6ad1a15c64f003f311fda418bce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b640c
26797535db11051d50aefd13b7b2759cfa19d163f53764b0fe8710cefdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b640c26797535db11051d50aefd13b7b2759cfa19d163f53764b0fe8710cefdc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe5359762a36b39c9f8ae939a1b9d094c42ffab59c4b9743a83b4391c171a512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe5359762a36b39c9f8ae939a1b9d094c42ffab59c4b9743a83b4391c171a512\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:36Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc31f057b776cf1925d265b64c7a9a9fb0993e4f3ffff530a48f28865e1c2954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc31f057b776cf1925d265b64c7a9a9fb0993e4f3ffff530a48f28865e1c2954\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mt7bq\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:49Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:49 crc kubenswrapper[4816]: I0316 00:08:49.572643 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lhpbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ec6a8ee-efd9-45df-bb35-706fcc90ebe9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a34b33696ef85102800de34359b2f8b4bf11c03ca6347dde766456bcea20e0e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-03-16T00:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c2x8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lhpbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:49Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:50 crc kubenswrapper[4816]: I0316 00:08:50.667436 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:08:50 crc kubenswrapper[4816]: I0316 00:08:50.667526 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:08:50 crc kubenswrapper[4816]: I0316 00:08:50.667466 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:08:50 crc kubenswrapper[4816]: I0316 00:08:50.667466 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jqsjn" Mar 16 00:08:50 crc kubenswrapper[4816]: E0316 00:08:50.667668 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:08:50 crc kubenswrapper[4816]: E0316 00:08:50.667787 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:08:50 crc kubenswrapper[4816]: E0316 00:08:50.667892 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jqsjn" podUID="84360ef9-0450-44c5-80eb-eab1bf8e808b" Mar 16 00:08:50 crc kubenswrapper[4816]: E0316 00:08:50.668003 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:08:52 crc kubenswrapper[4816]: I0316 00:08:52.666741 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:08:52 crc kubenswrapper[4816]: I0316 00:08:52.666764 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jqsjn" Mar 16 00:08:52 crc kubenswrapper[4816]: I0316 00:08:52.666805 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:08:52 crc kubenswrapper[4816]: E0316 00:08:52.667431 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jqsjn" podUID="84360ef9-0450-44c5-80eb-eab1bf8e808b" Mar 16 00:08:52 crc kubenswrapper[4816]: E0316 00:08:52.667233 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:08:52 crc kubenswrapper[4816]: I0316 00:08:52.666896 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:08:52 crc kubenswrapper[4816]: E0316 00:08:52.667661 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:08:52 crc kubenswrapper[4816]: E0316 00:08:52.667765 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:08:52 crc kubenswrapper[4816]: E0316 00:08:52.785732 4816 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 16 00:08:52 crc kubenswrapper[4816]: I0316 00:08:52.986618 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/84360ef9-0450-44c5-80eb-eab1bf8e808b-metrics-certs\") pod \"network-metrics-daemon-jqsjn\" (UID: \"84360ef9-0450-44c5-80eb-eab1bf8e808b\") " pod="openshift-multus/network-metrics-daemon-jqsjn" Mar 16 00:08:52 crc kubenswrapper[4816]: E0316 00:08:52.986895 4816 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 16 00:08:52 crc kubenswrapper[4816]: E0316 00:08:52.987001 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/84360ef9-0450-44c5-80eb-eab1bf8e808b-metrics-certs podName:84360ef9-0450-44c5-80eb-eab1bf8e808b nodeName:}" failed. No retries permitted until 2026-03-16 00:09:00.986974893 +0000 UTC m=+134.083274876 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/84360ef9-0450-44c5-80eb-eab1bf8e808b-metrics-certs") pod "network-metrics-daemon-jqsjn" (UID: "84360ef9-0450-44c5-80eb-eab1bf8e808b") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 16 00:08:54 crc kubenswrapper[4816]: I0316 00:08:54.666683 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jqsjn" Mar 16 00:08:54 crc kubenswrapper[4816]: I0316 00:08:54.666750 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:08:54 crc kubenswrapper[4816]: I0316 00:08:54.666699 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:08:54 crc kubenswrapper[4816]: I0316 00:08:54.666683 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:08:54 crc kubenswrapper[4816]: E0316 00:08:54.666882 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jqsjn" podUID="84360ef9-0450-44c5-80eb-eab1bf8e808b" Mar 16 00:08:54 crc kubenswrapper[4816]: E0316 00:08:54.666999 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:08:54 crc kubenswrapper[4816]: E0316 00:08:54.667204 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:08:54 crc kubenswrapper[4816]: E0316 00:08:54.667338 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:08:56 crc kubenswrapper[4816]: I0316 00:08:56.667597 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jqsjn" Mar 16 00:08:56 crc kubenswrapper[4816]: I0316 00:08:56.667723 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:08:56 crc kubenswrapper[4816]: E0316 00:08:56.668033 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jqsjn" podUID="84360ef9-0450-44c5-80eb-eab1bf8e808b" Mar 16 00:08:56 crc kubenswrapper[4816]: I0316 00:08:56.668085 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:08:56 crc kubenswrapper[4816]: I0316 00:08:56.668119 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:08:56 crc kubenswrapper[4816]: E0316 00:08:56.668257 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:08:56 crc kubenswrapper[4816]: E0316 00:08:56.668352 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:08:56 crc kubenswrapper[4816]: E0316 00:08:56.668486 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:08:56 crc kubenswrapper[4816]: I0316 00:08:56.669505 4816 scope.go:117] "RemoveContainer" containerID="f8dc5c9e05d9b292469e8d707f3727adbf33725ea339bcfefead2bbdbb094400" Mar 16 00:08:57 crc kubenswrapper[4816]: I0316 00:08:57.309474 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-psjs7_2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88/ovnkube-controller/1.log" Mar 16 00:08:57 crc kubenswrapper[4816]: I0316 00:08:57.311370 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" event={"ID":"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88","Type":"ContainerStarted","Data":"cf50ee7a67dc259915505ac9bbca0a709a016452caaeb0ffa14e5a5947db134e"} Mar 16 00:08:57 crc kubenswrapper[4816]: I0316 00:08:57.311727 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" Mar 16 
00:08:57 crc kubenswrapper[4816]: I0316 00:08:57.323571 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3a9d51fa-9450-4f18-be16-0a93d64889b4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94c8e0e7b0ecd4d0f525f55538bff0b01653544be87c923fc152d4d982da7021\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02c81b3d94044c1f4f344dd8306cacf8db533a662535ddb0a6a4ffc1bb4cf1f5\\\",\\\"i
mage\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37dea82cd38ef9cfbda3626b789b21af2ada4b4a58bc2ecc6ce55eda60052277\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e44b0bbe4cebe4e52726f3f654f8b4a2953abcd0ad32db752c4641b3f0be1b00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c
4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e44b0bbe4cebe4e52726f3f654f8b4a2953abcd0ad32db752c4641b3f0be1b00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:57Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:57 crc kubenswrapper[4816]: I0316 00:08:57.336472 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:57Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:57 crc kubenswrapper[4816]: I0316 00:08:57.352821 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:57Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:57 crc kubenswrapper[4816]: I0316 00:08:57.364950 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jqsjn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84360ef9-0450-44c5-80eb-eab1bf8e808b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxldb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxldb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jqsjn\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:57Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:57 crc kubenswrapper[4816]: I0316 00:08:57.385474 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49e35ef6-7a6d-438f-b598-4445d73db379\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbe90b49c5d42bb9d6d5d1b8f1d02b571403dd3f39d3118081351d72c4910914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\
":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://508e6dbe6f2f1392d16501e09ee2ec523a3c2a245cd96c6ab773e26e83b8bb6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b5a3223ff0fb5b96e5b87ee836add66498165759f4d6ed8cb6413f091967e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://603bdcbd4a6af5fda7ca09e4823e0db8d
7ec3e3eb38eddf9f062a14b433cb094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b061f16a65a27b9a5ef66231376b70fec084479a991b7fdd430957df76da32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e
49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminate
d\\\":{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:57Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:57 crc kubenswrapper[4816]: I0316 00:08:57.397725 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-szscw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9789e58-12c8-4831-9401-af48a3e92209\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6f0b7e8f95bbbf3b90133349b8b13d2784582a5d95dae8dd159328b570ae962\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxf6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-szscw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:57Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:57 crc kubenswrapper[4816]: I0316 00:08:57.409045 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd08ece2-7636-4966-973a-e96a34b70b53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6dd22d6548986bf126a8ce6b30c7c11d8c37bd0c03539a55cc28fb7debcfc26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trwbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7003a4592a48
f87ab63e9f5637c7665040f9784a7c8f599c82ed61270ddb793c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trwbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jrdcz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:57Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:57 crc kubenswrapper[4816]: I0316 00:08:57.434576 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f47b6c54d5d72d827208f4d8228be260d9d19b4a720b3174349288dee2916641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e0a510814d60106574e3cd3e612c031ac900caa0f24a12a62d7cd1f6bfda1ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://826495d8ab8f414169bf36159278da0a040fec699a74b4aebdc8f87ccdc48bd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa5f052d32ad8f52f33ee5baea5af98a64f05b831b86ebced6c9422fb4845fcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86c28f1ae2f2fb61b7755ac14bc7f5346036735c75daccaa0e31b4e606cfb4c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fac0a986edf984b0e34eda0f969205d5bc92f23d0edacb3c6dd8721eaed2fb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf50ee7a67dc259915505ac9bbca0a709a016452caaeb0ffa14e5a5947db134e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8dc5c9e05d9b292469e8d707f3727adbf33725ea339bcfefead2bbdbb094400\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-16T00:08:43Z\\\",\\\"message\\\":\\\"0:08:43.170302 6850 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0316 00:08:43.170323 6850 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0316 00:08:43.170342 6850 handler.go:208] Removed *v1.Node event handler 7\\\\nI0316 00:08:43.170346 6850 handler.go:190] Sending *v1.Pod event handler 3 for 
removal\\\\nI0316 00:08:43.170361 6850 handler.go:208] Removed *v1.Node event handler 2\\\\nI0316 00:08:43.170363 6850 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0316 00:08:43.170364 6850 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0316 00:08:43.170402 6850 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0316 00:08:43.170450 6850 factory.go:656] Stopping watch factory\\\\nI0316 00:08:43.170454 6850 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0316 00:08:43.170464 6850 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0316 00:08:43.170464 6850 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0316 00:08:43.170503 6850 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0316 00:08:43.170666 6850 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:42Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\
\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d6bf581280690c99ddbbecd4cf4e215a12230d377978aa48acd3c05b85a3c56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\
\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1249f8b1c01db98230c10bde49bdb68f17caccc1ba0546e462df4384ca1658dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1249f8b1c01db98230c10bde49bdb68f17caccc1ba0546e462df4384ca1658dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-psjs7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:57Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:57 crc kubenswrapper[4816]: I0316 00:08:57.447569 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://723654bc91ce4d40a27b14a9c4df1ed6c8dff2517f378927d7a6a96aeaa6951b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:57Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:57 crc kubenswrapper[4816]: I0316 00:08:57.461471 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37869baff974556c22aadadfa4b528b5d6b9ad236107b4dbc945b3cdbf743f20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-
03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f9fe727708fdaca68596b7d305629ade45b061a88a81085f885d490ab99e24b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:57Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:57 crc kubenswrapper[4816]: I0316 00:08:57.473616 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cnhkf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e686cd4-bddf-463e-b471-e49ea862691e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f6c3cacd4d7f6866e66b7234b280540ef2c443531de7a11f6894323ef24a9fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bwzt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\
"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cnhkf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:57Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:57 crc kubenswrapper[4816]: I0316 00:08:57.492893 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kbgw5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b28986d-e33b-4876-ab6d-64d69960fb8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://544bab5013b513f755dd9fa88cbc71021b0cfe823aa7508af994c0b1342c3646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b
3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zgr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0beb1aba8b8813abd98809200fa4dd348377427596c2935ceb012361634df2b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zgr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kbgw5\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:57Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:57 crc kubenswrapper[4816]: I0316 00:08:57.515432 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c8786d1-025d-4788-bafe-c2c2eaf8e398\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46db23088e8ba791d736f70a65bd066af80a5ec27c31332d9ac9c52530b21bfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da7b257af12b9ee605dc32a26d9565f866a9cafebb4936a0bde7f42ae9909f80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c2523ea43f2490afb81354be8a2840f54a3be1a8d3154196e0c8ac365d09723a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e7f3b9b1fc1648908bbf44f9ad32e1eb510b68336a8b60c8417df2f622a36e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0316 00:07:59.373922 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0316 00:07:59.374075 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0316 00:07:59.374860 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2696044728/tls.crt::/tmp/serving-cert-2696044728/tls.key\\\\\\\"\\\\nI0316 00:07:59.560928 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0316 00:07:59.563996 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0316 00:07:59.564013 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0316 00:07:59.564035 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0316 00:07:59.564041 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0316 00:07:59.571475 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0316 00:07:59.571523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571531 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571540 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0316 00:07:59.571574 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0316 00:07:59.571588 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0316 00:07:59.571493 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0316 00:07:59.571597 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0316 00:07:59.573484 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81d0c40be9c3912864280cd98481f837ef29f69d85599b39decf5e9d7a97eff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:57Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:57 crc kubenswrapper[4816]: I0316 00:08:57.530908 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://516852108b3d17a15676573ff080d3d94fa48d8076ed0404a8d827e715c2555e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:57Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:57 crc kubenswrapper[4816]: I0316 00:08:57.547592 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:57Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:57 crc kubenswrapper[4816]: I0316 00:08:57.572628 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mt7bq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"03ef49f1-0c6a-443a-8df3-2db339c562ed\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d31459f93f54cc1cd99638b1f108b8997b34f246cf8c65e5462e672f04aaa7c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8979f751ec4797986924a3efea84d02bc1c1cf303e1e992f349c9083f2fe4f08\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8979f751ec4797986924a3efea84d02bc1c1cf303e1e992f349c9083f2fe4f08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa87691b0dc314209d88d1f4d5b2227abc1de527e8111831bb8243ba6192c300\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa87691b0dc314209d88d1f4d5b2227abc1de527e8111831bb8243ba6192c300\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33f82b993e3d729a5830e88136e74687c930d6ad1a15c64f003f311fda418bce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33f82b993e3d729a5830e88136e74687c930d6ad1a15c64f003f311fda418bce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b640c
26797535db11051d50aefd13b7b2759cfa19d163f53764b0fe8710cefdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b640c26797535db11051d50aefd13b7b2759cfa19d163f53764b0fe8710cefdc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe5359762a36b39c9f8ae939a1b9d094c42ffab59c4b9743a83b4391c171a512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe5359762a36b39c9f8ae939a1b9d094c42ffab59c4b9743a83b4391c171a512\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:36Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc31f057b776cf1925d265b64c7a9a9fb0993e4f3ffff530a48f28865e1c2954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc31f057b776cf1925d265b64c7a9a9fb0993e4f3ffff530a48f28865e1c2954\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mt7bq\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:57Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:57 crc kubenswrapper[4816]: I0316 00:08:57.586865 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lhpbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ec6a8ee-efd9-45df-bb35-706fcc90ebe9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a34b33696ef85102800de34359b2f8b4bf11c03ca6347dde766456bcea20e0e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-03-16T00:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c2x8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lhpbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:57Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:57 crc kubenswrapper[4816]: I0316 00:08:57.691513 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://723654bc91ce4d40a27b14a9c4df1ed6c8dff2517f378927d7a6a96aeaa6951b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:57Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:57 crc kubenswrapper[4816]: I0316 00:08:57.714948 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37869baff974556c22aadadfa4b528b5d6b9ad236107b4dbc945b3cdbf743f20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://2f9fe727708fdaca68596b7d305629ade45b061a88a81085f885d490ab99e24b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:57Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:57 crc kubenswrapper[4816]: I0316 00:08:57.730842 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cnhkf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e686cd4-bddf-463e-b471-e49ea862691e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f6c3cacd4d7f6866e66b7234b280540ef2c443531de7a11f6894323ef24a9fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bwzt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cnhkf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:57Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:57 crc kubenswrapper[4816]: I0316 00:08:57.747358 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kbgw5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b28986d-e33b-4876-ab6d-64d69960fb8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://544bab5013b513f755dd9fa88cbc71021b0cfe823aa7508af994c0b1342c3646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7
93426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zgr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0beb1aba8b8813abd98809200fa4dd348377427596c2935ceb012361634df2b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zgr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kbgw5\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:57Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:57 crc kubenswrapper[4816]: I0316 00:08:57.767935 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c8786d1-025d-4788-bafe-c2c2eaf8e398\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46db23088e8ba791d736f70a65bd066af80a5ec27c31332d9ac9c52530b21bfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da7b257af12b9ee605dc32a26d9565f866a9cafebb4936a0bde7f42ae9909f80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c2523ea43f2490afb81354be8a2840f54a3be1a8d3154196e0c8ac365d09723a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e7f3b9b1fc1648908bbf44f9ad32e1eb510b68336a8b60c8417df2f622a36e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0316 00:07:59.373922 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0316 00:07:59.374075 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0316 00:07:59.374860 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2696044728/tls.crt::/tmp/serving-cert-2696044728/tls.key\\\\\\\"\\\\nI0316 00:07:59.560928 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0316 00:07:59.563996 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0316 00:07:59.564013 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0316 00:07:59.564035 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0316 00:07:59.564041 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0316 00:07:59.571475 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0316 00:07:59.571523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571531 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571540 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0316 00:07:59.571574 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0316 00:07:59.571588 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0316 00:07:59.571493 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0316 00:07:59.571597 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0316 00:07:59.573484 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81d0c40be9c3912864280cd98481f837ef29f69d85599b39decf5e9d7a97eff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:57Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:57 crc kubenswrapper[4816]: I0316 00:08:57.786493 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://516852108b3d17a15676573ff080d3d94fa48d8076ed0404a8d827e715c2555e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:57Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:57 crc kubenswrapper[4816]: E0316 00:08:57.786625 4816 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 16 00:08:57 crc kubenswrapper[4816]: I0316 00:08:57.804616 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:57Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:57 crc kubenswrapper[4816]: I0316 00:08:57.828672 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mt7bq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"03ef49f1-0c6a-443a-8df3-2db339c562ed\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d31459f93f54cc1cd99638b1f108b8997b34f246cf8c65e5462e672f04aaa7c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8979f751ec4797986924a3efea84d02bc1c1cf303e1e992f349c9083f2fe4f08\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8979f751ec4797986924a3efea84d02bc1c1cf303e1e992f349c9083f2fe4f08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa87691b0dc314209d88d1f4d5b2227abc1de527e8111831bb8243ba6192c300\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa87691b0dc314209d88d1f4d5b2227abc1de527e8111831bb8243ba6192c300\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33f82b993e3d729a5830e88136e74687c930d6ad1a15c64f003f311fda418bce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33f82b993e3d729a5830e88136e74687c930d6ad1a15c64f003f311fda418bce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b640c
26797535db11051d50aefd13b7b2759cfa19d163f53764b0fe8710cefdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b640c26797535db11051d50aefd13b7b2759cfa19d163f53764b0fe8710cefdc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe5359762a36b39c9f8ae939a1b9d094c42ffab59c4b9743a83b4391c171a512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe5359762a36b39c9f8ae939a1b9d094c42ffab59c4b9743a83b4391c171a512\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:36Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc31f057b776cf1925d265b64c7a9a9fb0993e4f3ffff530a48f28865e1c2954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc31f057b776cf1925d265b64c7a9a9fb0993e4f3ffff530a48f28865e1c2954\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mt7bq\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:57Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:57 crc kubenswrapper[4816]: I0316 00:08:57.846639 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lhpbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ec6a8ee-efd9-45df-bb35-706fcc90ebe9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a34b33696ef85102800de34359b2f8b4bf11c03ca6347dde766456bcea20e0e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-03-16T00:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c2x8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lhpbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:57Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:57 crc kubenswrapper[4816]: I0316 00:08:57.865095 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3a9d51fa-9450-4f18-be16-0a93d64889b4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94c8e0e7b0ecd4d0f525f55538bff0b01653544be87c923fc152d4d982da7021\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02c81b3d94044c1f4f344dd8306cacf8db533a662535ddb0a6a4ffc1bb4cf1f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37dea82cd38ef9cfbda3626b789b21af2ada4b4a58bc2ecc6ce55eda60052277\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e44b0bbe4cebe4e52726f3f654f8b4a2953abcd0ad32db752c4641b3f0be1b00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://e44b0bbe4cebe4e52726f3f654f8b4a2953abcd0ad32db752c4641b3f0be1b00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:57Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:57 crc kubenswrapper[4816]: I0316 00:08:57.885007 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:57Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:57 crc kubenswrapper[4816]: I0316 00:08:57.905349 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:57Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:57 crc kubenswrapper[4816]: I0316 00:08:57.920932 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jqsjn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84360ef9-0450-44c5-80eb-eab1bf8e808b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxldb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxldb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jqsjn\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:57Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:57 crc kubenswrapper[4816]: I0316 00:08:57.949572 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49e35ef6-7a6d-438f-b598-4445d73db379\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbe90b49c5d42bb9d6d5d1b8f1d02b571403dd3f39d3118081351d72c4910914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\
":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://508e6dbe6f2f1392d16501e09ee2ec523a3c2a245cd96c6ab773e26e83b8bb6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b5a3223ff0fb5b96e5b87ee836add66498165759f4d6ed8cb6413f091967e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://603bdcbd4a6af5fda7ca09e4823e0db8d
7ec3e3eb38eddf9f062a14b433cb094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b061f16a65a27b9a5ef66231376b70fec084479a991b7fdd430957df76da32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e
49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminate
d\\\":{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:57Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:57 crc kubenswrapper[4816]: I0316 00:08:57.969708 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-szscw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9789e58-12c8-4831-9401-af48a3e92209\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6f0b7e8f95bbbf3b90133349b8b13d2784582a5d95dae8dd159328b570ae962\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxf6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-szscw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:57Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:57 crc kubenswrapper[4816]: I0316 00:08:57.986591 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd08ece2-7636-4966-973a-e96a34b70b53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6dd22d6548986bf126a8ce6b30c7c11d8c37bd0c03539a55cc28fb7debcfc26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trwbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7003a4592a48
f87ab63e9f5637c7665040f9784a7c8f599c82ed61270ddb793c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trwbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jrdcz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:57Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:58 crc kubenswrapper[4816]: I0316 00:08:58.017407 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f47b6c54d5d72d827208f4d8228be260d9d19b4a720b3174349288dee2916641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e0a510814d60106574e3cd3e612c031ac900caa0f24a12a62d7cd1f6bfda1ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://826495d8ab8f414169bf36159278da0a040fec699a74b4aebdc8f87ccdc48bd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa5f052d32ad8f52f33ee5baea5af98a64f05b831b86ebced6c9422fb4845fcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86c28f1ae2f2fb61b7755ac14bc7f5346036735c75daccaa0e31b4e606cfb4c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fac0a986edf984b0e34eda0f969205d5bc92f23d0edacb3c6dd8721eaed2fb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf50ee7a67dc259915505ac9bbca0a709a016452caaeb0ffa14e5a5947db134e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8dc5c9e05d9b292469e8d707f3727adbf33725ea339bcfefead2bbdbb094400\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-16T00:08:43Z\\\",\\\"message\\\":\\\"0:08:43.170302 6850 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0316 00:08:43.170323 6850 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0316 00:08:43.170342 6850 handler.go:208] Removed *v1.Node event handler 7\\\\nI0316 00:08:43.170346 6850 handler.go:190] Sending *v1.Pod event handler 3 for 
removal\\\\nI0316 00:08:43.170361 6850 handler.go:208] Removed *v1.Node event handler 2\\\\nI0316 00:08:43.170363 6850 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0316 00:08:43.170364 6850 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0316 00:08:43.170402 6850 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0316 00:08:43.170450 6850 factory.go:656] Stopping watch factory\\\\nI0316 00:08:43.170454 6850 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0316 00:08:43.170464 6850 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0316 00:08:43.170464 6850 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0316 00:08:43.170503 6850 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0316 00:08:43.170666 6850 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:42Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\
\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d6bf581280690c99ddbbecd4cf4e215a12230d377978aa48acd3c05b85a3c56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\
\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1249f8b1c01db98230c10bde49bdb68f17caccc1ba0546e462df4384ca1658dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1249f8b1c01db98230c10bde49bdb68f17caccc1ba0546e462df4384ca1658dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-psjs7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:58Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:58 crc kubenswrapper[4816]: I0316 00:08:58.317332 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-psjs7_2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88/ovnkube-controller/2.log" Mar 16 00:08:58 crc kubenswrapper[4816]: I0316 00:08:58.318326 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-psjs7_2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88/ovnkube-controller/1.log" Mar 16 00:08:58 crc kubenswrapper[4816]: I0316 00:08:58.322945 4816 generic.go:334] "Generic (PLEG): container finished" podID="2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" containerID="cf50ee7a67dc259915505ac9bbca0a709a016452caaeb0ffa14e5a5947db134e" exitCode=1 Mar 16 00:08:58 crc kubenswrapper[4816]: I0316 00:08:58.323016 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" event={"ID":"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88","Type":"ContainerDied","Data":"cf50ee7a67dc259915505ac9bbca0a709a016452caaeb0ffa14e5a5947db134e"} Mar 16 00:08:58 crc kubenswrapper[4816]: I0316 00:08:58.323070 4816 scope.go:117] "RemoveContainer" containerID="f8dc5c9e05d9b292469e8d707f3727adbf33725ea339bcfefead2bbdbb094400" Mar 16 00:08:58 crc kubenswrapper[4816]: I0316 00:08:58.324108 4816 scope.go:117] "RemoveContainer" containerID="cf50ee7a67dc259915505ac9bbca0a709a016452caaeb0ffa14e5a5947db134e" Mar 16 00:08:58 crc kubenswrapper[4816]: E0316 00:08:58.324428 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-psjs7_openshift-ovn-kubernetes(2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88)\"" 
pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" podUID="2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" Mar 16 00:08:58 crc kubenswrapper[4816]: I0316 00:08:58.358169 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:58Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:58 crc kubenswrapper[4816]: I0316 00:08:58.382032 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mt7bq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"03ef49f1-0c6a-443a-8df3-2db339c562ed\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d31459f93f54cc1cd99638b1f108b8997b34f246cf8c65e5462e672f04aaa7c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8979f751ec4797986924a3efea84d02bc1c1cf303e1e992f349c9083f2fe4f08\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8979f751ec4797986924a3efea84d02bc1c1cf303e1e992f349c9083f2fe4f08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa87691b0dc314209d88d1f4d5b2227abc1de527e8111831bb8243ba6192c300\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa87691b0dc314209d88d1f4d5b2227abc1de527e8111831bb8243ba6192c300\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33f82b993e3d729a5830e88136e74687c930d6ad1a15c64f003f311fda418bce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33f82b993e3d729a5830e88136e74687c930d6ad1a15c64f003f311fda418bce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b640c
26797535db11051d50aefd13b7b2759cfa19d163f53764b0fe8710cefdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b640c26797535db11051d50aefd13b7b2759cfa19d163f53764b0fe8710cefdc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe5359762a36b39c9f8ae939a1b9d094c42ffab59c4b9743a83b4391c171a512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe5359762a36b39c9f8ae939a1b9d094c42ffab59c4b9743a83b4391c171a512\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:36Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc31f057b776cf1925d265b64c7a9a9fb0993e4f3ffff530a48f28865e1c2954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc31f057b776cf1925d265b64c7a9a9fb0993e4f3ffff530a48f28865e1c2954\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mt7bq\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:58Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:58 crc kubenswrapper[4816]: I0316 00:08:58.398014 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lhpbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ec6a8ee-efd9-45df-bb35-706fcc90ebe9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a34b33696ef85102800de34359b2f8b4bf11c03ca6347dde766456bcea20e0e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-03-16T00:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c2x8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lhpbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:58Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:58 crc kubenswrapper[4816]: I0316 00:08:58.418958 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c8786d1-025d-4788-bafe-c2c2eaf8e398\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46db23088e8ba791d736f70a65bd066af80a5ec27c31332d9ac9c52530b21bfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da7b257af12b9ee605dc32a26d9565f866a9cafebb4936a0bde7f42ae9909f80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2523ea43f2490afb81354be8a2840f54a3be1a8d3154196e0c8ac365d09723a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e7f3b9b1fc1648908bbf44f9ad32e1eb510b68336a8b60c8417df2f622a36e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0316 00:07:59.373922 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0316 00:07:59.374075 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0316 00:07:59.374860 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2696044728/tls.crt::/tmp/serving-cert-2696044728/tls.key\\\\\\\"\\\\nI0316 00:07:59.560928 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0316 00:07:59.563996 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0316 00:07:59.564013 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0316 00:07:59.564035 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0316 00:07:59.564041 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0316 00:07:59.571475 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0316 00:07:59.571523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571531 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571540 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0316 00:07:59.571574 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0316 00:07:59.571588 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0316 00:07:59.571493 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0316 00:07:59.571597 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0316 00:07:59.573484 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81d0c40be9c3912864280cd98481f837ef29f69d85599b39decf5e9d7a97eff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:58Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:58 crc kubenswrapper[4816]: I0316 00:08:58.438221 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://516852108b3d17a15676573ff080d3d94fa48d8076ed0404a8d827e715c2555e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:58Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:58 crc kubenswrapper[4816]: I0316 00:08:58.459317 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:58Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:58 crc kubenswrapper[4816]: I0316 00:08:58.479796 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:58Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:58 crc kubenswrapper[4816]: I0316 00:08:58.495762 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jqsjn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84360ef9-0450-44c5-80eb-eab1bf8e808b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxldb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxldb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jqsjn\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:58Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:58 crc kubenswrapper[4816]: I0316 00:08:58.514802 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3a9d51fa-9450-4f18-be16-0a93d64889b4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94c8e0e7b0ecd4d0f525f55538bff0b01653544be87c923fc152d4d982da7021\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02c81b3d94044c1f4f344dd8306cacf8db533a662535ddb0a6a4ffc1bb4cf1f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37dea82cd38ef9cfbda3626b789b21af2ada4b4a58bc2ecc6ce55eda60052277\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e44b0bbe4cebe4e52726f3f654f8b4a2953abcd0a
d32db752c4641b3f0be1b00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e44b0bbe4cebe4e52726f3f654f8b4a2953abcd0ad32db752c4641b3f0be1b00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:58Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:58 crc kubenswrapper[4816]: I0316 00:08:58.531750 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd08ece2-7636-4966-973a-e96a34b70b53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6dd22d6548986bf126a8ce6b30c7c11d8c37bd0c03539a55cc28fb7debcfc26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trwbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7003a4592a48f87ab63e9f5637c7665040f9784a
7c8f599c82ed61270ddb793c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trwbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jrdcz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:58Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:58 crc kubenswrapper[4816]: I0316 00:08:58.564491 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f47b6c54d5d72d827208f4d8228be260d9d19b4a720b3174349288dee2916641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e0a510814d60106574e3cd3e612c031ac900caa0f24a12a62d7cd1f6bfda1ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://826495d8ab8f414169bf36159278da0a040fec699a74b4aebdc8f87ccdc48bd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa5f052d32ad8f52f33ee5baea5af98a64f05b831b86ebced6c9422fb4845fcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86c28f1ae2f2fb61b7755ac14bc7f5346036735c75daccaa0e31b4e606cfb4c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fac0a986edf984b0e34eda0f969205d5bc92f23d0edacb3c6dd8721eaed2fb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf50ee7a67dc259915505ac9bbca0a709a016452caaeb0ffa14e5a5947db134e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8dc5c9e05d9b292469e8d707f3727adbf33725ea339bcfefead2bbdbb094400\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-16T00:08:43Z\\\",\\\"message\\\":\\\"0:08:43.170302 6850 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0316 00:08:43.170323 6850 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0316 00:08:43.170342 6850 handler.go:208] Removed *v1.Node event handler 7\\\\nI0316 00:08:43.170346 6850 handler.go:190] Sending *v1.Pod event handler 3 for 
removal\\\\nI0316 00:08:43.170361 6850 handler.go:208] Removed *v1.Node event handler 2\\\\nI0316 00:08:43.170363 6850 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0316 00:08:43.170364 6850 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0316 00:08:43.170402 6850 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0316 00:08:43.170450 6850 factory.go:656] Stopping watch factory\\\\nI0316 00:08:43.170454 6850 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0316 00:08:43.170464 6850 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0316 00:08:43.170464 6850 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0316 00:08:43.170503 6850 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0316 00:08:43.170666 6850 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:42Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cf50ee7a67dc259915505ac9bbca0a709a016452caaeb0ffa14e5a5947db134e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-16T00:08:57Z\\\",\\\"message\\\":\\\"} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:57Z is after 2025-08-24T17:21:41Z]\\\\nI0316 
00:08:57.626466 7110 services_controller.go:356] Processing sync for service openshift-machine-config-operator/machine-config-controller for network=default\\\\nI0316 00:08:57.626473 7110 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-storage-version-migrator-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"7f9b8f25-db1a-4d02-a423-9afc5c2fb83c\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-storage-version-migrator-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}},\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-di
r\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d6bf581280690c99ddbbecd4cf4e215a12230d377978aa48acd3c05b85a3c56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name
\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1249f8b1c01db98230c10bde49bdb68f17caccc1ba0546e462df4384ca1658dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1249f8b1c01db98230c10bde49bdb68f17caccc1ba0546e462df4384ca1658dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-psjs7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:58Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:58 crc kubenswrapper[4816]: 
I0316 00:08:58.598641 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49e35ef6-7a6d-438f-b598-4445d73db379\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbe90b49c5d42bb9d6d5d1b8f1d02b571403dd3f39d3118081351d72c4910914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\
\":\\\"cri-o://508e6dbe6f2f1392d16501e09ee2ec523a3c2a245cd96c6ab773e26e83b8bb6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b5a3223ff0fb5b96e5b87ee836add66498165759f4d6ed8cb6413f091967e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://603bdcbd4a6af5fda7ca09e4823e0db8d7ec3e3eb38eddf9f062a14b433cb094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866b
e30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b061f16a65a27b9a5ef66231376b70fec084479a991b7fdd430957df76da32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminate
d\\\":{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:58Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:58 crc kubenswrapper[4816]: I0316 00:08:58.620229 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-szscw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9789e58-12c8-4831-9401-af48a3e92209\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6f0b7e8f95bbbf3b90133349b8b13d2784582a5d95d
ae8dd159328b570ae962\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernete
s.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxf6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-szscw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:58Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:58 crc kubenswrapper[4816]: I0316 00:08:58.640091 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37869baff974556c22aadadfa4b528b5d6b9ad236107b4dbc945b3cdbf743f20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d209
9482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f9fe727708fdaca68596b7d305629ade45b061a88a81085f885d490ab99e24b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-03-16T00:08:58Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:58 crc kubenswrapper[4816]: I0316 00:08:58.656632 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cnhkf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e686cd4-bddf-463e-b471-e49ea862691e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f6c3cacd4d7f6866e66b7234b280540ef2c443531de7a11f6894323ef24a9fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-acc
ess-9bwzt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cnhkf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:58Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:58 crc kubenswrapper[4816]: I0316 00:08:58.667338 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:08:58 crc kubenswrapper[4816]: I0316 00:08:58.667445 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:08:58 crc kubenswrapper[4816]: E0316 00:08:58.667843 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:08:58 crc kubenswrapper[4816]: I0316 00:08:58.667912 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:08:58 crc kubenswrapper[4816]: I0316 00:08:58.668007 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jqsjn" Mar 16 00:08:58 crc kubenswrapper[4816]: E0316 00:08:58.668183 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:08:58 crc kubenswrapper[4816]: E0316 00:08:58.668388 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:08:58 crc kubenswrapper[4816]: E0316 00:08:58.668609 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jqsjn" podUID="84360ef9-0450-44c5-80eb-eab1bf8e808b" Mar 16 00:08:58 crc kubenswrapper[4816]: I0316 00:08:58.681512 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kbgw5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b28986d-e33b-4876-ab6d-64d69960fb8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://544bab5013b513f755dd9fa88cbc71021b0cfe823aa7508af994c0b1342c3646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\"
,\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zgr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0beb1aba8b8813abd98809200fa4dd348377427596c2935ceb012361634df2b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zgr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kbgw5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:58Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:58 crc kubenswrapper[4816]: I0316 00:08:58.704036 4816 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://723654bc91ce4d40a27b14a9c4df1ed6c8dff2517f378927d7a6a96aeaa6951b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:58Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:59 crc kubenswrapper[4816]: I0316 00:08:59.330546 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-psjs7_2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88/ovnkube-controller/2.log" Mar 16 00:08:59 crc kubenswrapper[4816]: I0316 00:08:59.338168 4816 scope.go:117] "RemoveContainer" containerID="cf50ee7a67dc259915505ac9bbca0a709a016452caaeb0ffa14e5a5947db134e" Mar 16 00:08:59 crc kubenswrapper[4816]: E0316 00:08:59.338630 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-psjs7_openshift-ovn-kubernetes(2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88)\"" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" podUID="2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" Mar 16 00:08:59 crc kubenswrapper[4816]: I0316 00:08:59.358133 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3a9d51fa-9450-4f18-be16-0a93d64889b4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94c8e0e7b0ecd4d0f525f55538bff0b01653544be87c923fc152d4d982da7021\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02c81b3d94044c1f4f344dd8306cacf8db533a662535ddb0a6a4ffc1bb4cf1f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37dea82cd38ef9cfbda3626b789b21af2ada4b4a58bc2ecc6ce55eda60052277\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e44b0bbe4cebe4e52726f3f654f8b4a2953abcd0ad32db752c4641b3f0be1b00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://e44b0bbe4cebe4e52726f3f654f8b4a2953abcd0ad32db752c4641b3f0be1b00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:59Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:59 crc kubenswrapper[4816]: I0316 00:08:59.378728 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:59Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:59 crc kubenswrapper[4816]: I0316 00:08:59.397776 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:59Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:59 crc kubenswrapper[4816]: I0316 00:08:59.412425 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jqsjn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84360ef9-0450-44c5-80eb-eab1bf8e808b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxldb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxldb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jqsjn\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:59Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:59 crc kubenswrapper[4816]: I0316 00:08:59.451386 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49e35ef6-7a6d-438f-b598-4445d73db379\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbe90b49c5d42bb9d6d5d1b8f1d02b571403dd3f39d3118081351d72c4910914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\
":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://508e6dbe6f2f1392d16501e09ee2ec523a3c2a245cd96c6ab773e26e83b8bb6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b5a3223ff0fb5b96e5b87ee836add66498165759f4d6ed8cb6413f091967e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://603bdcbd4a6af5fda7ca09e4823e0db8d
7ec3e3eb38eddf9f062a14b433cb094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b061f16a65a27b9a5ef66231376b70fec084479a991b7fdd430957df76da32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e
49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminate
d\\\":{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:59Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:59 crc kubenswrapper[4816]: I0316 00:08:59.474800 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-szscw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9789e58-12c8-4831-9401-af48a3e92209\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6f0b7e8f95bbbf3b90133349b8b13d2784582a5d95dae8dd159328b570ae962\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxf6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-szscw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:59Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:59 crc kubenswrapper[4816]: I0316 00:08:59.490143 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd08ece2-7636-4966-973a-e96a34b70b53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6dd22d6548986bf126a8ce6b30c7c11d8c37bd0c03539a55cc28fb7debcfc26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trwbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7003a4592a48
f87ab63e9f5637c7665040f9784a7c8f599c82ed61270ddb793c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trwbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jrdcz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:59Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:59 crc kubenswrapper[4816]: I0316 00:08:59.493873 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:59 crc kubenswrapper[4816]: I0316 00:08:59.493916 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:59 crc kubenswrapper[4816]: I0316 00:08:59.493933 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 
16 00:08:59 crc kubenswrapper[4816]: I0316 00:08:59.493957 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:59 crc kubenswrapper[4816]: I0316 00:08:59.493974 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:59Z","lastTransitionTime":"2026-03-16T00:08:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:59 crc kubenswrapper[4816]: E0316 00:08:59.508988 4816 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:59Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d8fb348c-4907-4c8f-859a-735976530e03\\\",\\\"systemUUID\\\":\\\"e97abf98-298f-4589-a70a-4cfb5cb2994a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:59Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:59 crc kubenswrapper[4816]: I0316 00:08:59.512541 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:59 crc kubenswrapper[4816]: I0316 00:08:59.512608 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:59 crc kubenswrapper[4816]: I0316 00:08:59.512619 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:59 crc kubenswrapper[4816]: I0316 00:08:59.512635 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:59 crc kubenswrapper[4816]: I0316 00:08:59.512647 4816 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:59Z","lastTransitionTime":"2026-03-16T00:08:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:59 crc kubenswrapper[4816]: I0316 00:08:59.518269 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f47b6c54d5d72d827208f4d8228be260d9d19b4a720b3174349288dee2916641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e0a510814d60106574e3cd3e612c031ac900caa0f24a12a62d7cd1f6bfda1ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://826495d8ab8f414169bf36159278da0a040fec699a74b4aebdc8f87ccdc48bd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa5f052d32ad8f52f33ee5baea5af98a64f05b831b86ebced6c9422fb4845fcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86c28f1ae2f2fb61b7755ac14bc7f5346036735c75daccaa0e31b4e606cfb4c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fac0a986edf984b0e34eda0f969205d5bc92f23d0edacb3c6dd8721eaed2fb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf50ee7a67dc259915505ac9bbca0a709a016452caaeb0ffa14e5a5947db134e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cf50ee7a67dc259915505ac9bbca0a709a016452caaeb0ffa14e5a5947db134e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-16T00:08:57Z\\\",\\\"message\\\":\\\"} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post 
\\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:57Z is after 2025-08-24T17:21:41Z]\\\\nI0316 00:08:57.626466 7110 services_controller.go:356] Processing sync for service openshift-machine-config-operator/machine-config-controller for network=default\\\\nI0316 00:08:57.626473 7110 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-storage-version-migrator-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"7f9b8f25-db1a-4d02-a423-9afc5c2fb83c\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-storage-version-migrator-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}},\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:56Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-psjs7_openshift-ovn-kubernetes(2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d6bf581280690c99ddbbecd4cf4e215a12230d377978aa48acd3c05b85a3c56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1249f8b1c01db98230c10bde49bdb68f17caccc1ba0546e462df4384ca1658dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1249f8b1c01db98230
c10bde49bdb68f17caccc1ba0546e462df4384ca1658dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-psjs7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:59Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:59 crc kubenswrapper[4816]: E0316 00:08:59.525823 4816 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d8fb348c-4907-4c8f-859a-735976530e03\\\",\\\"systemUUID\\\":\\\"e97abf98-298f-4589-a70a-4cfb5cb2994a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:59Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:59 crc kubenswrapper[4816]: I0316 00:08:59.529749 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:59 crc kubenswrapper[4816]: I0316 00:08:59.529797 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:59 crc kubenswrapper[4816]: I0316 00:08:59.529809 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:59 crc kubenswrapper[4816]: I0316 00:08:59.529827 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:59 crc kubenswrapper[4816]: I0316 00:08:59.529839 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:59Z","lastTransitionTime":"2026-03-16T00:08:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:59 crc kubenswrapper[4816]: I0316 00:08:59.538971 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://723654bc91ce4d40a27b14a9c4df1ed6c8dff2517f378927d7a6a96aeaa6951b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:59Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:59 crc kubenswrapper[4816]: E0316 00:08:59.542045 4816 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d8fb348c-4907-4c8f-859a-735976530e03\\\",\\\"systemUUID\\\":\\\"e97abf98-298f-4589-a70a-4cfb5cb2994a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:59Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:59 crc kubenswrapper[4816]: I0316 00:08:59.546302 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:59 crc kubenswrapper[4816]: I0316 00:08:59.546335 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:59 crc kubenswrapper[4816]: I0316 00:08:59.546346 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:59 crc kubenswrapper[4816]: I0316 00:08:59.546364 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:59 crc kubenswrapper[4816]: I0316 00:08:59.546376 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:59Z","lastTransitionTime":"2026-03-16T00:08:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:59 crc kubenswrapper[4816]: I0316 00:08:59.552095 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37869baff974556c22aadadfa4b528b5d6b9ad236107b4dbc945b3cdbf743f20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f9fe727708fdaca68596b7d305629ade45b061a88a81085f885d490ab99e24b\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:59Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:59 crc kubenswrapper[4816]: E0316 00:08:59.560794 4816 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d8fb348c-4907-4c8f-859a-735976530e03\\\",\\\"systemUUID\\\":\\\"e97abf98-298f-4589-a70a-4cfb5cb2994a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:59Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:59 crc kubenswrapper[4816]: I0316 00:08:59.563275 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cnhkf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e686cd4-bddf-463e-b471-e49ea862691e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f6c3cacd4d7f6866e66b7234b280540ef2c443531de7a11f6894323ef24a9fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"st
ate\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bwzt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cnhkf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:59Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:59 crc kubenswrapper[4816]: I0316 00:08:59.565864 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:59 crc kubenswrapper[4816]: I0316 00:08:59.565907 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:59 crc kubenswrapper[4816]: I0316 00:08:59.565920 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:59 crc kubenswrapper[4816]: I0316 00:08:59.565938 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:59 crc kubenswrapper[4816]: I0316 00:08:59.565952 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:59Z","lastTransitionTime":"2026-03-16T00:08:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:59 crc kubenswrapper[4816]: I0316 00:08:59.576530 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kbgw5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b28986d-e33b-4876-ab6d-64d69960fb8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://544bab5013b513f755dd9fa88cbc71021b0cfe823aa7508af994c0b1342c3646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\
"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zgr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0beb1aba8b8813abd98809200fa4dd348377427596c2935ceb012361634df2b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zgr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kbgw5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:59Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:59 crc kubenswrapper[4816]: E0316 
00:08:59.580661 4816 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d8fb348c-4907-4c8f-859a-735976530e03\\\",\\\"systemUUID\\\":\\\"e97abf98-298f-4589-a70a-4cfb5cb2994a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:59Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:59 crc kubenswrapper[4816]: E0316 00:08:59.580815 4816 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 16 00:08:59 crc kubenswrapper[4816]: I0316 00:08:59.595935 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c8786d1-025d-4788-bafe-c2c2eaf8e398\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46db23088e8ba791d736f70a65bd066af80a5ec27c31332d9ac9c52530b21bfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da7b257af12b9ee605dc32a26d9565f866a9cafebb4936a0bde7f42ae9909f80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c2523ea43f2490afb81354be8a2840f54a3be1a8d3154196e0c8ac365d09723a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e7f3b9b1fc1648908bbf44f9ad32e1eb510b68336a8b60c8417df2f622a36e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0316 00:07:59.373922 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0316 00:07:59.374075 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0316 00:07:59.374860 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2696044728/tls.crt::/tmp/serving-cert-2696044728/tls.key\\\\\\\"\\\\nI0316 00:07:59.560928 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0316 00:07:59.563996 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0316 00:07:59.564013 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0316 00:07:59.564035 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0316 00:07:59.564041 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0316 00:07:59.571475 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0316 00:07:59.571523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571531 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571540 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0316 00:07:59.571574 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0316 00:07:59.571588 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0316 00:07:59.571493 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0316 00:07:59.571597 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0316 00:07:59.573484 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81d0c40be9c3912864280cd98481f837ef29f69d85599b39decf5e9d7a97eff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:59Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:59 crc kubenswrapper[4816]: I0316 00:08:59.608152 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://516852108b3d17a15676573ff080d3d94fa48d8076ed0404a8d827e715c2555e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:59Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:59 crc kubenswrapper[4816]: I0316 00:08:59.627018 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:59Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:59 crc kubenswrapper[4816]: I0316 00:08:59.648406 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mt7bq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"03ef49f1-0c6a-443a-8df3-2db339c562ed\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d31459f93f54cc1cd99638b1f108b8997b34f246cf8c65e5462e672f04aaa7c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8979f751ec4797986924a3efea84d02bc1c1cf303e1e992f349c9083f2fe4f08\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8979f751ec4797986924a3efea84d02bc1c1cf303e1e992f349c9083f2fe4f08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa87691b0dc314209d88d1f4d5b2227abc1de527e8111831bb8243ba6192c300\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa87691b0dc314209d88d1f4d5b2227abc1de527e8111831bb8243ba6192c300\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33f82b993e3d729a5830e88136e74687c930d6ad1a15c64f003f311fda418bce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33f82b993e3d729a5830e88136e74687c930d6ad1a15c64f003f311fda418bce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b640c
26797535db11051d50aefd13b7b2759cfa19d163f53764b0fe8710cefdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b640c26797535db11051d50aefd13b7b2759cfa19d163f53764b0fe8710cefdc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe5359762a36b39c9f8ae939a1b9d094c42ffab59c4b9743a83b4391c171a512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe5359762a36b39c9f8ae939a1b9d094c42ffab59c4b9743a83b4391c171a512\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:36Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc31f057b776cf1925d265b64c7a9a9fb0993e4f3ffff530a48f28865e1c2954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc31f057b776cf1925d265b64c7a9a9fb0993e4f3ffff530a48f28865e1c2954\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mt7bq\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:59Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:59 crc kubenswrapper[4816]: I0316 00:08:59.664745 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lhpbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ec6a8ee-efd9-45df-bb35-706fcc90ebe9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a34b33696ef85102800de34359b2f8b4bf11c03ca6347dde766456bcea20e0e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-03-16T00:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c2x8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lhpbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:59Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:00 crc kubenswrapper[4816]: I0316 00:09:00.667238 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jqsjn" Mar 16 00:09:00 crc kubenswrapper[4816]: I0316 00:09:00.667323 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:09:00 crc kubenswrapper[4816]: E0316 00:09:00.667389 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jqsjn" podUID="84360ef9-0450-44c5-80eb-eab1bf8e808b" Mar 16 00:09:00 crc kubenswrapper[4816]: I0316 00:09:00.667437 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:09:00 crc kubenswrapper[4816]: E0316 00:09:00.667457 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:09:00 crc kubenswrapper[4816]: I0316 00:09:00.667317 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:09:00 crc kubenswrapper[4816]: E0316 00:09:00.667679 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:09:00 crc kubenswrapper[4816]: E0316 00:09:00.667916 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:09:00 crc kubenswrapper[4816]: I0316 00:09:00.682457 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Mar 16 00:09:01 crc kubenswrapper[4816]: I0316 00:09:01.085499 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/84360ef9-0450-44c5-80eb-eab1bf8e808b-metrics-certs\") pod \"network-metrics-daemon-jqsjn\" (UID: \"84360ef9-0450-44c5-80eb-eab1bf8e808b\") " pod="openshift-multus/network-metrics-daemon-jqsjn" Mar 16 00:09:01 crc kubenswrapper[4816]: E0316 00:09:01.085828 4816 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 16 00:09:01 crc kubenswrapper[4816]: E0316 00:09:01.085951 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/84360ef9-0450-44c5-80eb-eab1bf8e808b-metrics-certs podName:84360ef9-0450-44c5-80eb-eab1bf8e808b nodeName:}" failed. No retries permitted until 2026-03-16 00:09:17.085924331 +0000 UTC m=+150.182224314 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/84360ef9-0450-44c5-80eb-eab1bf8e808b-metrics-certs") pod "network-metrics-daemon-jqsjn" (UID: "84360ef9-0450-44c5-80eb-eab1bf8e808b") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 16 00:09:01 crc kubenswrapper[4816]: I0316 00:09:01.299146 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 16 00:09:01 crc kubenswrapper[4816]: I0316 00:09:01.323479 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c8786d1-025d-4788-bafe-c2c2eaf8e398\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46db23088e8ba791d736f70a65bd066af80a5ec27c31332d9ac9c52530b21bfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da7b257af12b9ee605dc32a26d9565f866a9cafebb4936a0bde7f42ae9909f80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2523ea43f2490afb81354be8a2840f54a3be1a8d3154196e0c8ac365d09723a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e7f3b9b1fc1648908bbf44f9ad32e1eb510b68336a8b60c8417df2f622a36e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0316 00:07:59.373922 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0316 00:07:59.374075 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0316 00:07:59.374860 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2696044728/tls.crt::/tmp/serving-cert-2696044728/tls.key\\\\\\\"\\\\nI0316 00:07:59.560928 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0316 00:07:59.563996 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0316 00:07:59.564013 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0316 00:07:59.564035 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0316 00:07:59.564041 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0316 00:07:59.571475 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0316 00:07:59.571523 1 secure_serving.go:69] Use of insecure cipher 
'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571531 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571540 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0316 00:07:59.571574 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0316 00:07:59.571588 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0316 00:07:59.571493 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0316 00:07:59.571597 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0316 00:07:59.573484 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81d0c40be9c3912864280cd98481f837ef29f69d85599b39decf5e9d7a97eff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}}}]
,\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:01Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:01 crc kubenswrapper[4816]: I0316 00:09:01.345365 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://516852108b3d17a15676573ff080d3d94fa48d8076ed0404a8d827e715c2555e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-16T00:09:01Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:01 crc kubenswrapper[4816]: I0316 00:09:01.364503 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:01Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:01 crc kubenswrapper[4816]: I0316 00:09:01.383756 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mt7bq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"03ef49f1-0c6a-443a-8df3-2db339c562ed\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d31459f93f54cc1cd99638b1f108b8997b34f246cf8c65e5462e672f04aaa7c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8979f751ec4797986924a3efea84d02bc1c1cf303e1e992f349c9083f2fe4f08\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8979f751ec4797986924a3efea84d02bc1c1cf303e1e992f349c9083f2fe4f08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa87691b0dc314209d88d1f4d5b2227abc1de527e8111831bb8243ba6192c300\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa87691b0dc314209d88d1f4d5b2227abc1de527e8111831bb8243ba6192c300\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33f82b993e3d729a5830e88136e74687c930d6ad1a15c64f003f311fda418bce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33f82b993e3d729a5830e88136e74687c930d6ad1a15c64f003f311fda418bce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b640c
26797535db11051d50aefd13b7b2759cfa19d163f53764b0fe8710cefdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b640c26797535db11051d50aefd13b7b2759cfa19d163f53764b0fe8710cefdc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe5359762a36b39c9f8ae939a1b9d094c42ffab59c4b9743a83b4391c171a512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe5359762a36b39c9f8ae939a1b9d094c42ffab59c4b9743a83b4391c171a512\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:36Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc31f057b776cf1925d265b64c7a9a9fb0993e4f3ffff530a48f28865e1c2954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc31f057b776cf1925d265b64c7a9a9fb0993e4f3ffff530a48f28865e1c2954\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mt7bq\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:01Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:01 crc kubenswrapper[4816]: I0316 00:09:01.397386 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lhpbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ec6a8ee-efd9-45df-bb35-706fcc90ebe9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a34b33696ef85102800de34359b2f8b4bf11c03ca6347dde766456bcea20e0e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-03-16T00:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c2x8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lhpbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:01Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:01 crc kubenswrapper[4816]: I0316 00:09:01.413732 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a695b85-75f5-42bb-a13c-aa20512355e4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6e7e97dbf041ed59e094f9be34956cb28d4774022777f2371a2eb752937f551\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fea427361067bc1a9d56b7b6699b072b5cdeee8345bf6618b8d81c6848f62098\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:20Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0316 00:06:49.819775 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0316 00:06:49.823008 1 observer_polling.go:159] Starting file observer\\\\nI0316 00:06:49.860579 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0316 00:06:49.866006 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0316 00:07:20.134248 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:19Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f3dfc8f46079b51e52802920a734bf796a00db2cf42501b9f7e202e0e9bc2ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5afc1a568801d941c814fe5e4eb91df609672a9431082d578d600f68f669aa3f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c60304164aa2f8e740ab25d779ef9b0aa66f6acca476bae45a13f4be44e37e33\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:01Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:01 crc kubenswrapper[4816]: I0316 00:09:01.433948 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3a9d51fa-9450-4f18-be16-0a93d64889b4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94c8e0e7b0ecd4d0f525f55538bff0b01653544be87c923fc152d4d982da7021\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02c81b3d94044c1f4f344dd8306cacf8db533a662535ddb0a6a4ffc1bb4cf1f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37dea82cd38ef9cfbda3626b789b21af2ada4b4a58bc2ecc6ce55eda60052277\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e44b0bbe4cebe4e52726f3f654f8b4a2953abcd0ad32db752c4641b3f0be1b00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://e44b0bbe4cebe4e52726f3f654f8b4a2953abcd0ad32db752c4641b3f0be1b00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:01Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:01 crc kubenswrapper[4816]: I0316 00:09:01.452270 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:01Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:01 crc kubenswrapper[4816]: I0316 00:09:01.472305 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:01Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:01 crc kubenswrapper[4816]: I0316 00:09:01.488817 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jqsjn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84360ef9-0450-44c5-80eb-eab1bf8e808b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxldb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxldb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jqsjn\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:01Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:01 crc kubenswrapper[4816]: I0316 00:09:01.522082 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49e35ef6-7a6d-438f-b598-4445d73db379\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbe90b49c5d42bb9d6d5d1b8f1d02b571403dd3f39d3118081351d72c4910914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\
":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://508e6dbe6f2f1392d16501e09ee2ec523a3c2a245cd96c6ab773e26e83b8bb6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b5a3223ff0fb5b96e5b87ee836add66498165759f4d6ed8cb6413f091967e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://603bdcbd4a6af5fda7ca09e4823e0db8d
7ec3e3eb38eddf9f062a14b433cb094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b061f16a65a27b9a5ef66231376b70fec084479a991b7fdd430957df76da32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e
49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminate
d\\\":{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:01Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:01 crc kubenswrapper[4816]: I0316 00:09:01.545287 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-szscw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9789e58-12c8-4831-9401-af48a3e92209\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6f0b7e8f95bbbf3b90133349b8b13d2784582a5d95dae8dd159328b570ae962\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxf6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-szscw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:01Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:01 crc kubenswrapper[4816]: I0316 00:09:01.563831 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd08ece2-7636-4966-973a-e96a34b70b53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6dd22d6548986bf126a8ce6b30c7c11d8c37bd0c03539a55cc28fb7debcfc26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trwbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7003a4592a48
f87ab63e9f5637c7665040f9784a7c8f599c82ed61270ddb793c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trwbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jrdcz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:01Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:01 crc kubenswrapper[4816]: I0316 00:09:01.594626 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f47b6c54d5d72d827208f4d8228be260d9d19b4a720b3174349288dee2916641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e0a510814d60106574e3cd3e612c031ac900caa0f24a12a62d7cd1f6bfda1ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://826495d8ab8f414169bf36159278da0a040fec699a74b4aebdc8f87ccdc48bd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa5f052d32ad8f52f33ee5baea5af98a64f05b831b86ebced6c9422fb4845fcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86c28f1ae2f2fb61b7755ac14bc7f5346036735c75daccaa0e31b4e606cfb4c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fac0a986edf984b0e34eda0f969205d5bc92f23d0edacb3c6dd8721eaed2fb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf50ee7a67dc259915505ac9bbca0a709a016452caaeb0ffa14e5a5947db134e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cf50ee7a67dc259915505ac9bbca0a709a016452caaeb0ffa14e5a5947db134e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-16T00:08:57Z\\\",\\\"message\\\":\\\"} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post 
\\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:57Z is after 2025-08-24T17:21:41Z]\\\\nI0316 00:08:57.626466 7110 services_controller.go:356] Processing sync for service openshift-machine-config-operator/machine-config-controller for network=default\\\\nI0316 00:08:57.626473 7110 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-storage-version-migrator-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"7f9b8f25-db1a-4d02-a423-9afc5c2fb83c\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-storage-version-migrator-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}},\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:56Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-psjs7_openshift-ovn-kubernetes(2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d6bf581280690c99ddbbecd4cf4e215a12230d377978aa48acd3c05b85a3c56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1249f8b1c01db98230c10bde49bdb68f17caccc1ba0546e462df4384ca1658dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1249f8b1c01db98230
c10bde49bdb68f17caccc1ba0546e462df4384ca1658dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-psjs7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:01Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:01 crc kubenswrapper[4816]: I0316 00:09:01.617986 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://723654bc91ce4d40a27b14a9c4df1ed6c8dff2517f378927d7a6a96aeaa6951b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:01Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:01 crc kubenswrapper[4816]: I0316 00:09:01.639396 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37869baff974556c22aadadfa4b528b5d6b9ad236107b4dbc945b3cdbf743f20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://2f9fe727708fdaca68596b7d305629ade45b061a88a81085f885d490ab99e24b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:01Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:01 crc kubenswrapper[4816]: I0316 00:09:01.656007 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cnhkf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e686cd4-bddf-463e-b471-e49ea862691e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f6c3cacd4d7f6866e66b7234b280540ef2c443531de7a11f6894323ef24a9fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bwzt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cnhkf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:01Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:01 crc kubenswrapper[4816]: I0316 00:09:01.676894 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kbgw5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b28986d-e33b-4876-ab6d-64d69960fb8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://544bab5013b513f755dd9fa88cbc71021b0cfe823aa7508af994c0b1342c3646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7
93426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zgr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0beb1aba8b8813abd98809200fa4dd348377427596c2935ceb012361634df2b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zgr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kbgw5\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:01Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:02 crc kubenswrapper[4816]: I0316 00:09:02.667696 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jqsjn" Mar 16 00:09:02 crc kubenswrapper[4816]: I0316 00:09:02.667724 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:09:02 crc kubenswrapper[4816]: E0316 00:09:02.668208 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jqsjn" podUID="84360ef9-0450-44c5-80eb-eab1bf8e808b" Mar 16 00:09:02 crc kubenswrapper[4816]: I0316 00:09:02.667787 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:09:02 crc kubenswrapper[4816]: I0316 00:09:02.667770 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:09:02 crc kubenswrapper[4816]: E0316 00:09:02.668395 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:09:02 crc kubenswrapper[4816]: E0316 00:09:02.668471 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:09:02 crc kubenswrapper[4816]: E0316 00:09:02.668609 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:09:02 crc kubenswrapper[4816]: E0316 00:09:02.788014 4816 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 16 00:09:04 crc kubenswrapper[4816]: I0316 00:09:04.666931 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jqsjn" Mar 16 00:09:04 crc kubenswrapper[4816]: I0316 00:09:04.667007 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:09:04 crc kubenswrapper[4816]: E0316 00:09:04.667111 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jqsjn" podUID="84360ef9-0450-44c5-80eb-eab1bf8e808b" Mar 16 00:09:04 crc kubenswrapper[4816]: I0316 00:09:04.667131 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:09:04 crc kubenswrapper[4816]: I0316 00:09:04.667157 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:09:04 crc kubenswrapper[4816]: E0316 00:09:04.667259 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:09:04 crc kubenswrapper[4816]: E0316 00:09:04.667354 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:09:04 crc kubenswrapper[4816]: E0316 00:09:04.667487 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:09:06 crc kubenswrapper[4816]: I0316 00:09:06.667045 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jqsjn" Mar 16 00:09:06 crc kubenswrapper[4816]: I0316 00:09:06.667088 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:09:06 crc kubenswrapper[4816]: I0316 00:09:06.667105 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:09:06 crc kubenswrapper[4816]: I0316 00:09:06.667059 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:09:06 crc kubenswrapper[4816]: E0316 00:09:06.667385 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:09:06 crc kubenswrapper[4816]: E0316 00:09:06.667482 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jqsjn" podUID="84360ef9-0450-44c5-80eb-eab1bf8e808b" Mar 16 00:09:06 crc kubenswrapper[4816]: E0316 00:09:06.667335 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:09:06 crc kubenswrapper[4816]: E0316 00:09:06.667667 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:09:07 crc kubenswrapper[4816]: I0316 00:09:07.692388 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c8786d1-025d-4788-bafe-c2c2eaf8e398\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46db23088e8ba791d736f70a65bd066af80a5ec27c31332d9ac9c52530b21bfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-d
ir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da7b257af12b9ee605dc32a26d9565f866a9cafebb4936a0bde7f42ae9909f80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2523ea43f2490afb81354be8a2840f54a3be1a8d3154196e0c8ac365d09723a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e7f3b9b1fc1648908bbf44f9ad32e1eb510b68336a8b60c8417df2f622a36e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshi
ft-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0316 00:07:59.373922 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0316 00:07:59.374075 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0316 00:07:59.374860 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2696044728/tls.crt::/tmp/serving-cert-2696044728/tls.key\\\\\\\"\\\\nI0316 00:07:59.560928 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0316 00:07:59.563996 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0316 00:07:59.564013 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0316 00:07:59.564035 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0316 00:07:59.564041 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0316 00:07:59.571475 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0316 00:07:59.571523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571531 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571540 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0316 00:07:59.571574 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' 
detected.\\\\nW0316 00:07:59.571588 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0316 00:07:59.571493 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0316 00:07:59.571597 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0316 00:07:59.573484 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81d0c40be9c3912864280cd98481f837ef29f69d85599b39decf5e9d7a97eff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/open
shift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:07Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:07 crc kubenswrapper[4816]: I0316 00:09:07.711286 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://516852108b3d17a15676573ff080d3d94fa48d8076ed0404a8d827e715c2555e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-16T00:09:07Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:07 crc kubenswrapper[4816]: I0316 00:09:07.732251 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:07Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:07 crc kubenswrapper[4816]: I0316 00:09:07.756159 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mt7bq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"03ef49f1-0c6a-443a-8df3-2db339c562ed\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d31459f93f54cc1cd99638b1f108b8997b34f246cf8c65e5462e672f04aaa7c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8979f751ec4797986924a3efea84d02bc1c1cf303e1e992f349c9083f2fe4f08\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8979f751ec4797986924a3efea84d02bc1c1cf303e1e992f349c9083f2fe4f08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa87691b0dc314209d88d1f4d5b2227abc1de527e8111831bb8243ba6192c300\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa87691b0dc314209d88d1f4d5b2227abc1de527e8111831bb8243ba6192c300\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33f82b993e3d729a5830e88136e74687c930d6ad1a15c64f003f311fda418bce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33f82b993e3d729a5830e88136e74687c930d6ad1a15c64f003f311fda418bce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b640c
26797535db11051d50aefd13b7b2759cfa19d163f53764b0fe8710cefdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b640c26797535db11051d50aefd13b7b2759cfa19d163f53764b0fe8710cefdc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe5359762a36b39c9f8ae939a1b9d094c42ffab59c4b9743a83b4391c171a512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe5359762a36b39c9f8ae939a1b9d094c42ffab59c4b9743a83b4391c171a512\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:36Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc31f057b776cf1925d265b64c7a9a9fb0993e4f3ffff530a48f28865e1c2954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc31f057b776cf1925d265b64c7a9a9fb0993e4f3ffff530a48f28865e1c2954\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mt7bq\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:07Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:07 crc kubenswrapper[4816]: I0316 00:09:07.773199 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lhpbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ec6a8ee-efd9-45df-bb35-706fcc90ebe9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a34b33696ef85102800de34359b2f8b4bf11c03ca6347dde766456bcea20e0e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-03-16T00:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c2x8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lhpbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:07Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:07 crc kubenswrapper[4816]: E0316 00:09:07.789094 4816 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 16 00:09:07 crc kubenswrapper[4816]: I0316 00:09:07.796527 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a695b85-75f5-42bb-a13c-aa20512355e4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6e7e97dbf041ed59e094f9be34956cb28d4774022777f2371a2eb752937f551\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fea427361067bc1a9d56b7b6699b072b5cdeee8345bf6618b8d81c6848f62098\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:20Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start 
--config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0316 00:06:49.819775 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0316 00:06:49.823008 1 observer_polling.go:159] Starting file observer\\\\nI0316 00:06:49.860579 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0316 00:06:49.866006 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0316 00:07:20.134248 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:19Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f3dfc8f46079b51e52802920a734bf796a00db2cf42501b9f7e202e0e9bc2ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5afc1a568801d941c814fe5e4eb91df609672a9431082d578d600f68f669aa3f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c60304164aa2f8e740ab25d779ef9b0aa66f6acca476bae45a13f4be44e37e33\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:07Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:07 crc kubenswrapper[4816]: I0316 00:09:07.811479 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3a9d51fa-9450-4f18-be16-0a93d64889b4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94c8e0e7b0ecd4d0f525f55538bff0b01653544be87c923fc152d4d982da7021\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02c81b3d94044c1f4f344dd8306cacf8db533a662535ddb0a6a4ffc1bb4cf1f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37dea82cd38ef9cfbda3626b789b21af2ada4b4a58bc2ecc6ce55eda60052277\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e44b0bbe4cebe4e52726f3f654f8b4a2953abcd0ad32db752c4641b3f0be1b00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://e44b0bbe4cebe4e52726f3f654f8b4a2953abcd0ad32db752c4641b3f0be1b00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:07Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:07 crc kubenswrapper[4816]: I0316 00:09:07.824753 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:07Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:07 crc kubenswrapper[4816]: I0316 00:09:07.838691 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:07Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:07 crc kubenswrapper[4816]: I0316 00:09:07.854933 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jqsjn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84360ef9-0450-44c5-80eb-eab1bf8e808b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxldb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxldb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jqsjn\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:07Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:07 crc kubenswrapper[4816]: I0316 00:09:07.882024 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49e35ef6-7a6d-438f-b598-4445d73db379\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbe90b49c5d42bb9d6d5d1b8f1d02b571403dd3f39d3118081351d72c4910914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\
":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://508e6dbe6f2f1392d16501e09ee2ec523a3c2a245cd96c6ab773e26e83b8bb6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b5a3223ff0fb5b96e5b87ee836add66498165759f4d6ed8cb6413f091967e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://603bdcbd4a6af5fda7ca09e4823e0db8d
7ec3e3eb38eddf9f062a14b433cb094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b061f16a65a27b9a5ef66231376b70fec084479a991b7fdd430957df76da32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e
49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminate
d\\\":{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:07Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:07 crc kubenswrapper[4816]: I0316 00:09:07.903457 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-szscw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9789e58-12c8-4831-9401-af48a3e92209\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6f0b7e8f95bbbf3b90133349b8b13d2784582a5d95dae8dd159328b570ae962\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxf6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-szscw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:07Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:07 crc kubenswrapper[4816]: I0316 00:09:07.921840 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd08ece2-7636-4966-973a-e96a34b70b53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6dd22d6548986bf126a8ce6b30c7c11d8c37bd0c03539a55cc28fb7debcfc26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trwbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7003a4592a48
f87ab63e9f5637c7665040f9784a7c8f599c82ed61270ddb793c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trwbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jrdcz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:07Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:07 crc kubenswrapper[4816]: I0316 00:09:07.955418 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f47b6c54d5d72d827208f4d8228be260d9d19b4a720b3174349288dee2916641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e0a510814d60106574e3cd3e612c031ac900caa0f24a12a62d7cd1f6bfda1ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://826495d8ab8f414169bf36159278da0a040fec699a74b4aebdc8f87ccdc48bd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa5f052d32ad8f52f33ee5baea5af98a64f05b831b86ebced6c9422fb4845fcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86c28f1ae2f2fb61b7755ac14bc7f5346036735c75daccaa0e31b4e606cfb4c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fac0a986edf984b0e34eda0f969205d5bc92f23d0edacb3c6dd8721eaed2fb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf50ee7a67dc259915505ac9bbca0a709a016452caaeb0ffa14e5a5947db134e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cf50ee7a67dc259915505ac9bbca0a709a016452caaeb0ffa14e5a5947db134e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-16T00:08:57Z\\\",\\\"message\\\":\\\"} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post 
\\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:57Z is after 2025-08-24T17:21:41Z]\\\\nI0316 00:08:57.626466 7110 services_controller.go:356] Processing sync for service openshift-machine-config-operator/machine-config-controller for network=default\\\\nI0316 00:08:57.626473 7110 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-storage-version-migrator-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"7f9b8f25-db1a-4d02-a423-9afc5c2fb83c\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-storage-version-migrator-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}},\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:56Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-psjs7_openshift-ovn-kubernetes(2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d6bf581280690c99ddbbecd4cf4e215a12230d377978aa48acd3c05b85a3c56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1249f8b1c01db98230c10bde49bdb68f17caccc1ba0546e462df4384ca1658dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1249f8b1c01db98230
c10bde49bdb68f17caccc1ba0546e462df4384ca1658dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-psjs7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:07Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:07 crc kubenswrapper[4816]: I0316 00:09:07.981481 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://723654bc91ce4d40a27b14a9c4df1ed6c8dff2517f378927d7a6a96aeaa6951b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:07Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:08 crc kubenswrapper[4816]: I0316 00:09:08.003088 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37869baff974556c22aadadfa4b528b5d6b9ad236107b4dbc945b3cdbf743f20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://2f9fe727708fdaca68596b7d305629ade45b061a88a81085f885d490ab99e24b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:08Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:08 crc kubenswrapper[4816]: I0316 00:09:08.020073 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cnhkf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e686cd4-bddf-463e-b471-e49ea862691e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f6c3cacd4d7f6866e66b7234b280540ef2c443531de7a11f6894323ef24a9fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bwzt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cnhkf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:08Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:08 crc kubenswrapper[4816]: I0316 00:09:08.036377 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kbgw5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b28986d-e33b-4876-ab6d-64d69960fb8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://544bab5013b513f755dd9fa88cbc71021b0cfe823aa7508af994c0b1342c3646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7
93426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zgr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0beb1aba8b8813abd98809200fa4dd348377427596c2935ceb012361634df2b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zgr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kbgw5\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:08Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:08 crc kubenswrapper[4816]: I0316 00:09:08.666978 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:09:08 crc kubenswrapper[4816]: I0316 00:09:08.667008 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jqsjn" Mar 16 00:09:08 crc kubenswrapper[4816]: E0316 00:09:08.667116 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:09:08 crc kubenswrapper[4816]: I0316 00:09:08.667167 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:09:08 crc kubenswrapper[4816]: E0316 00:09:08.667297 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jqsjn" podUID="84360ef9-0450-44c5-80eb-eab1bf8e808b" Mar 16 00:09:08 crc kubenswrapper[4816]: I0316 00:09:08.667316 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:09:08 crc kubenswrapper[4816]: E0316 00:09:08.667368 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:09:08 crc kubenswrapper[4816]: E0316 00:09:08.667421 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:09:09 crc kubenswrapper[4816]: I0316 00:09:09.836607 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:09:09 crc kubenswrapper[4816]: I0316 00:09:09.836675 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:09:09 crc kubenswrapper[4816]: I0316 00:09:09.836692 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:09:09 crc kubenswrapper[4816]: I0316 00:09:09.836718 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:09:09 crc kubenswrapper[4816]: I0316 00:09:09.836736 4816 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:09:09Z","lastTransitionTime":"2026-03-16T00:09:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:09:09 crc kubenswrapper[4816]: E0316 00:09:09.858092 4816 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d8fb348c-4907-4c8f-859a-735976530e03\\\",\\\"systemUUID\\\":\\\"e97abf98-298f-4589-a70a-4cfb5cb2994a\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:09Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:09 crc kubenswrapper[4816]: I0316 00:09:09.862070 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:09:09 crc kubenswrapper[4816]: I0316 00:09:09.862116 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:09:09 crc kubenswrapper[4816]: I0316 00:09:09.862135 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:09:09 crc kubenswrapper[4816]: I0316 00:09:09.862156 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:09:09 crc kubenswrapper[4816]: I0316 00:09:09.862173 4816 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:09:09Z","lastTransitionTime":"2026-03-16T00:09:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:09:09 crc kubenswrapper[4816]: E0316 00:09:09.882014 4816 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d8fb348c-4907-4c8f-859a-735976530e03\\\",\\\"systemUUID\\\":\\\"e97abf98-298f-4589-a70a-4cfb5cb2994a\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:09Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:09 crc kubenswrapper[4816]: I0316 00:09:09.887391 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:09:09 crc kubenswrapper[4816]: I0316 00:09:09.887744 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:09:09 crc kubenswrapper[4816]: I0316 00:09:09.887946 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:09:09 crc kubenswrapper[4816]: I0316 00:09:09.888145 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:09:09 crc kubenswrapper[4816]: I0316 00:09:09.888332 4816 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:09:09Z","lastTransitionTime":"2026-03-16T00:09:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:09:09 crc kubenswrapper[4816]: E0316 00:09:09.907643 4816 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d8fb348c-4907-4c8f-859a-735976530e03\\\",\\\"systemUUID\\\":\\\"e97abf98-298f-4589-a70a-4cfb5cb2994a\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:09Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:09 crc kubenswrapper[4816]: I0316 00:09:09.912696 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:09:09 crc kubenswrapper[4816]: I0316 00:09:09.912860 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:09:09 crc kubenswrapper[4816]: I0316 00:09:09.912982 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:09:09 crc kubenswrapper[4816]: I0316 00:09:09.913072 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:09:09 crc kubenswrapper[4816]: I0316 00:09:09.913168 4816 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:09:09Z","lastTransitionTime":"2026-03-16T00:09:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:09:09 crc kubenswrapper[4816]: E0316 00:09:09.928276 4816 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d8fb348c-4907-4c8f-859a-735976530e03\\\",\\\"systemUUID\\\":\\\"e97abf98-298f-4589-a70a-4cfb5cb2994a\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:09Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:09 crc kubenswrapper[4816]: I0316 00:09:09.932455 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:09:09 crc kubenswrapper[4816]: I0316 00:09:09.932505 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:09:09 crc kubenswrapper[4816]: I0316 00:09:09.932520 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:09:09 crc kubenswrapper[4816]: I0316 00:09:09.932540 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:09:09 crc kubenswrapper[4816]: I0316 00:09:09.932586 4816 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:09:09Z","lastTransitionTime":"2026-03-16T00:09:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:09:09 crc kubenswrapper[4816]: E0316 00:09:09.952629 4816 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d8fb348c-4907-4c8f-859a-735976530e03\\\",\\\"systemUUID\\\":\\\"e97abf98-298f-4589-a70a-4cfb5cb2994a\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:09Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:09 crc kubenswrapper[4816]: E0316 00:09:09.952861 4816 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 16 00:09:10 crc kubenswrapper[4816]: I0316 00:09:10.667251 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:09:10 crc kubenswrapper[4816]: I0316 00:09:10.667402 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:09:10 crc kubenswrapper[4816]: E0316 00:09:10.667626 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:09:10 crc kubenswrapper[4816]: I0316 00:09:10.667717 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jqsjn" Mar 16 00:09:10 crc kubenswrapper[4816]: I0316 00:09:10.667731 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:09:10 crc kubenswrapper[4816]: E0316 00:09:10.667917 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jqsjn" podUID="84360ef9-0450-44c5-80eb-eab1bf8e808b" Mar 16 00:09:10 crc kubenswrapper[4816]: E0316 00:09:10.668097 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:09:10 crc kubenswrapper[4816]: E0316 00:09:10.668611 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:09:12 crc kubenswrapper[4816]: I0316 00:09:12.514502 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:09:12 crc kubenswrapper[4816]: E0316 00:09:12.515200 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:16.515148953 +0000 UTC m=+209.611448946 (durationBeforeRetry 1m4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:09:12 crc kubenswrapper[4816]: I0316 00:09:12.616328 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:09:12 crc kubenswrapper[4816]: I0316 00:09:12.616786 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:09:12 crc kubenswrapper[4816]: E0316 00:09:12.616509 4816 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 16 00:09:12 crc kubenswrapper[4816]: E0316 00:09:12.616966 4816 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 16 00:09:12 crc kubenswrapper[4816]: E0316 00:09:12.617008 4816 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not 
registered Mar 16 00:09:12 crc kubenswrapper[4816]: E0316 00:09:12.617032 4816 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 16 00:09:12 crc kubenswrapper[4816]: I0316 00:09:12.616892 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:09:12 crc kubenswrapper[4816]: E0316 00:09:12.617111 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-16 00:10:16.617052604 +0000 UTC m=+209.713352797 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 16 00:09:12 crc kubenswrapper[4816]: E0316 00:09:12.617192 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-16 00:10:16.617160827 +0000 UTC m=+209.713460810 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 16 00:09:12 crc kubenswrapper[4816]: I0316 00:09:12.617242 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:09:12 crc kubenswrapper[4816]: E0316 00:09:12.617335 4816 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 16 00:09:12 crc kubenswrapper[4816]: E0316 00:09:12.617389 4816 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 16 00:09:12 crc kubenswrapper[4816]: E0316 00:09:12.617445 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-16 00:10:16.617430745 +0000 UTC m=+209.713730728 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 16 00:09:12 crc kubenswrapper[4816]: E0316 00:09:12.617402 4816 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 16 00:09:12 crc kubenswrapper[4816]: E0316 00:09:12.617657 4816 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 16 00:09:12 crc kubenswrapper[4816]: E0316 00:09:12.617738 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-16 00:10:16.617727334 +0000 UTC m=+209.714027287 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 16 00:09:12 crc kubenswrapper[4816]: I0316 00:09:12.667362 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:09:12 crc kubenswrapper[4816]: I0316 00:09:12.667370 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:09:12 crc kubenswrapper[4816]: I0316 00:09:12.667399 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:09:12 crc kubenswrapper[4816]: I0316 00:09:12.667423 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jqsjn" Mar 16 00:09:12 crc kubenswrapper[4816]: E0316 00:09:12.668086 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:09:12 crc kubenswrapper[4816]: E0316 00:09:12.668069 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:09:12 crc kubenswrapper[4816]: I0316 00:09:12.668247 4816 scope.go:117] "RemoveContainer" containerID="cf50ee7a67dc259915505ac9bbca0a709a016452caaeb0ffa14e5a5947db134e" Mar 16 00:09:12 crc kubenswrapper[4816]: E0316 00:09:12.668261 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jqsjn" podUID="84360ef9-0450-44c5-80eb-eab1bf8e808b" Mar 16 00:09:12 crc kubenswrapper[4816]: E0316 00:09:12.668337 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:09:12 crc kubenswrapper[4816]: E0316 00:09:12.668542 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-psjs7_openshift-ovn-kubernetes(2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88)\"" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" podUID="2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" Mar 16 00:09:12 crc kubenswrapper[4816]: E0316 00:09:12.791374 4816 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" Mar 16 00:09:14 crc kubenswrapper[4816]: I0316 00:09:14.667479 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jqsjn" Mar 16 00:09:14 crc kubenswrapper[4816]: I0316 00:09:14.667524 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:09:14 crc kubenswrapper[4816]: I0316 00:09:14.667529 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:09:14 crc kubenswrapper[4816]: I0316 00:09:14.667614 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:09:14 crc kubenswrapper[4816]: E0316 00:09:14.669297 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jqsjn" podUID="84360ef9-0450-44c5-80eb-eab1bf8e808b" Mar 16 00:09:14 crc kubenswrapper[4816]: E0316 00:09:14.669442 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:09:14 crc kubenswrapper[4816]: E0316 00:09:14.669588 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:09:14 crc kubenswrapper[4816]: E0316 00:09:14.669663 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:09:14 crc kubenswrapper[4816]: I0316 00:09:14.684674 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Mar 16 00:09:16 crc kubenswrapper[4816]: I0316 00:09:16.666956 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:09:16 crc kubenswrapper[4816]: E0316 00:09:16.667197 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:09:16 crc kubenswrapper[4816]: I0316 00:09:16.667684 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jqsjn" Mar 16 00:09:16 crc kubenswrapper[4816]: E0316 00:09:16.667829 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jqsjn" podUID="84360ef9-0450-44c5-80eb-eab1bf8e808b" Mar 16 00:09:16 crc kubenswrapper[4816]: I0316 00:09:16.667885 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:09:16 crc kubenswrapper[4816]: E0316 00:09:16.667964 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:09:16 crc kubenswrapper[4816]: I0316 00:09:16.669475 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:09:16 crc kubenswrapper[4816]: E0316 00:09:16.669708 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:09:17 crc kubenswrapper[4816]: I0316 00:09:17.167542 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/84360ef9-0450-44c5-80eb-eab1bf8e808b-metrics-certs\") pod \"network-metrics-daemon-jqsjn\" (UID: \"84360ef9-0450-44c5-80eb-eab1bf8e808b\") " pod="openshift-multus/network-metrics-daemon-jqsjn" Mar 16 00:09:17 crc kubenswrapper[4816]: E0316 00:09:17.167839 4816 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 16 00:09:17 crc kubenswrapper[4816]: E0316 00:09:17.168008 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/84360ef9-0450-44c5-80eb-eab1bf8e808b-metrics-certs podName:84360ef9-0450-44c5-80eb-eab1bf8e808b nodeName:}" failed. No retries permitted until 2026-03-16 00:09:49.167952979 +0000 UTC m=+182.264252972 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/84360ef9-0450-44c5-80eb-eab1bf8e808b-metrics-certs") pod "network-metrics-daemon-jqsjn" (UID: "84360ef9-0450-44c5-80eb-eab1bf8e808b") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 16 00:09:17 crc kubenswrapper[4816]: I0316 00:09:17.681709 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"580d9565-d55c-4108-af2d-07ab6fa7ea2e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2a41f624c9d40a637cf958648b795587652d7139d4b7cea2e8c70d1cdf05dd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volu
meMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59fcd3602ca5395b4f9b934768cad48d5504480af46a1fff8203eb46415b8769\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59fcd3602ca5395b4f9b934768cad48d5504480af46a1fff8203eb46415b8769\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:17Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:17 crc kubenswrapper[4816]: I0316 00:09:17.701732 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://723654bc91ce4d40a27b14a9c4df1ed6c8dff2517f378927d7a6a96aeaa6951b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:17Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:17 crc kubenswrapper[4816]: I0316 00:09:17.720774 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37869baff974556c22aadadfa4b528b5d6b9ad236107b4dbc945b3cdbf743f20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://2f9fe727708fdaca68596b7d305629ade45b061a88a81085f885d490ab99e24b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:17Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:17 crc kubenswrapper[4816]: I0316 00:09:17.740257 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cnhkf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e686cd4-bddf-463e-b471-e49ea862691e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f6c3cacd4d7f6866e66b7234b280540ef2c443531de7a11f6894323ef24a9fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bwzt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cnhkf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:17Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:17 crc kubenswrapper[4816]: I0316 00:09:17.756815 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kbgw5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b28986d-e33b-4876-ab6d-64d69960fb8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://544bab5013b513f755dd9fa88cbc71021b0cfe823aa7508af994c0b1342c3646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7
93426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zgr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0beb1aba8b8813abd98809200fa4dd348377427596c2935ceb012361634df2b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zgr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kbgw5\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:17Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:17 crc kubenswrapper[4816]: I0316 00:09:17.779228 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c8786d1-025d-4788-bafe-c2c2eaf8e398\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46db23088e8ba791d736f70a65bd066af80a5ec27c31332d9ac9c52530b21bfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubern
etes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da7b257af12b9ee605dc32a26d9565f866a9cafebb4936a0bde7f42ae9909f80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2523ea43f2490afb81354be8a2840f54a3be1a8d3154196e0c8ac365d09723a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e7f3b9b1fc1648908bbf44f9ad32e1eb510b68336a8b60c8417df2f622a36e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cl
uster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0316 00:07:59.373922 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0316 00:07:59.374075 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0316 00:07:59.374860 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2696044728/tls.crt::/tmp/serving-cert-2696044728/tls.key\\\\\\\"\\\\nI0316 00:07:59.560928 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0316 00:07:59.563996 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0316 00:07:59.564013 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0316 00:07:59.564035 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0316 00:07:59.564041 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0316 00:07:59.571475 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0316 00:07:59.571523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571531 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571540 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0316 00:07:59.571574 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0316 00:07:59.571588 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0316 00:07:59.571493 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0316 00:07:59.571597 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0316 00:07:59.573484 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81d0c40be9c3912864280cd98481f837ef29f69d85599b39decf5e9d7a97eff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:17Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:17 crc kubenswrapper[4816]: E0316 00:09:17.792883 4816 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 16 00:09:17 crc kubenswrapper[4816]: I0316 00:09:17.800838 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://516852108b3d17a15676573ff080d3d94fa48d8076ed0404a8d827e715c2555e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:17Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:17 crc kubenswrapper[4816]: I0316 00:09:17.823865 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:17Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:17 crc kubenswrapper[4816]: I0316 00:09:17.847080 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mt7bq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"03ef49f1-0c6a-443a-8df3-2db339c562ed\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d31459f93f54cc1cd99638b1f108b8997b34f246cf8c65e5462e672f04aaa7c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8979f751ec4797986924a3efea84d02bc1c1cf303e1e992f349c9083f2fe4f08\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8979f751ec4797986924a3efea84d02bc1c1cf303e1e992f349c9083f2fe4f08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa87691b0dc314209d88d1f4d5b2227abc1de527e8111831bb8243ba6192c300\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa87691b0dc314209d88d1f4d5b2227abc1de527e8111831bb8243ba6192c300\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33f82b993e3d729a5830e88136e74687c930d6ad1a15c64f003f311fda418bce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33f82b993e3d729a5830e88136e74687c930d6ad1a15c64f003f311fda418bce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b640c
26797535db11051d50aefd13b7b2759cfa19d163f53764b0fe8710cefdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b640c26797535db11051d50aefd13b7b2759cfa19d163f53764b0fe8710cefdc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe5359762a36b39c9f8ae939a1b9d094c42ffab59c4b9743a83b4391c171a512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe5359762a36b39c9f8ae939a1b9d094c42ffab59c4b9743a83b4391c171a512\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:36Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc31f057b776cf1925d265b64c7a9a9fb0993e4f3ffff530a48f28865e1c2954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc31f057b776cf1925d265b64c7a9a9fb0993e4f3ffff530a48f28865e1c2954\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mt7bq\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:17Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:17 crc kubenswrapper[4816]: I0316 00:09:17.860339 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lhpbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ec6a8ee-efd9-45df-bb35-706fcc90ebe9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a34b33696ef85102800de34359b2f8b4bf11c03ca6347dde766456bcea20e0e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-03-16T00:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c2x8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lhpbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:17Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:17 crc kubenswrapper[4816]: I0316 00:09:17.876507 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a695b85-75f5-42bb-a13c-aa20512355e4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6e7e97dbf041ed59e094f9be34956cb28d4774022777f2371a2eb752937f551\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fea427361067bc1a9d56b7b6699b072b5cdeee8345bf6618b8d81c6848f62098\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:20Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0316 00:06:49.819775 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0316 00:06:49.823008 1 observer_polling.go:159] Starting file observer\\\\nI0316 00:06:49.860579 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0316 00:06:49.866006 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0316 00:07:20.134248 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:19Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f3dfc8f46079b51e52802920a734bf796a00db2cf42501b9f7e202e0e9bc2ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5afc1a568801d941c814fe5e4eb91df609672a9431082d578d600f68f669aa3f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c60304164aa2f8e740ab25d779ef9b0aa66f6acca476bae45a13f4be44e37e33\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:17Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:17 crc kubenswrapper[4816]: I0316 00:09:17.894123 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3a9d51fa-9450-4f18-be16-0a93d64889b4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94c8e0e7b0ecd4d0f525f55538bff0b01653544be87c923fc152d4d982da7021\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02c81b3d94044c1f4f344dd8306cacf8db533a662535ddb0a6a4ffc1bb4cf1f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37dea82cd38ef9cfbda3626b789b21af2ada4b4a58bc2ecc6ce55eda60052277\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e44b0bbe4cebe4e52726f3f654f8b4a2953abcd0ad32db752c4641b3f0be1b00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://e44b0bbe4cebe4e52726f3f654f8b4a2953abcd0ad32db752c4641b3f0be1b00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:17Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:17 crc kubenswrapper[4816]: I0316 00:09:17.911802 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:17Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:17 crc kubenswrapper[4816]: I0316 00:09:17.928903 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:17Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:17 crc kubenswrapper[4816]: I0316 00:09:17.943638 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jqsjn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84360ef9-0450-44c5-80eb-eab1bf8e808b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxldb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxldb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jqsjn\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:17Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:17 crc kubenswrapper[4816]: I0316 00:09:17.975203 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49e35ef6-7a6d-438f-b598-4445d73db379\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbe90b49c5d42bb9d6d5d1b8f1d02b571403dd3f39d3118081351d72c4910914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\
":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://508e6dbe6f2f1392d16501e09ee2ec523a3c2a245cd96c6ab773e26e83b8bb6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b5a3223ff0fb5b96e5b87ee836add66498165759f4d6ed8cb6413f091967e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://603bdcbd4a6af5fda7ca09e4823e0db8d
7ec3e3eb38eddf9f062a14b433cb094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b061f16a65a27b9a5ef66231376b70fec084479a991b7fdd430957df76da32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e
49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminate
d\\\":{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:17Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:17 crc kubenswrapper[4816]: I0316 00:09:17.996347 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-szscw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9789e58-12c8-4831-9401-af48a3e92209\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6f0b7e8f95bbbf3b90133349b8b13d2784582a5d95dae8dd159328b570ae962\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxf6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-szscw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:17Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:18 crc kubenswrapper[4816]: I0316 00:09:18.015003 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd08ece2-7636-4966-973a-e96a34b70b53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6dd22d6548986bf126a8ce6b30c7c11d8c37bd0c03539a55cc28fb7debcfc26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trwbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7003a4592a48
f87ab63e9f5637c7665040f9784a7c8f599c82ed61270ddb793c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trwbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jrdcz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:18Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:18 crc kubenswrapper[4816]: I0316 00:09:18.045910 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f47b6c54d5d72d827208f4d8228be260d9d19b4a720b3174349288dee2916641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e0a510814d60106574e3cd3e612c031ac900caa0f24a12a62d7cd1f6bfda1ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://826495d8ab8f414169bf36159278da0a040fec699a74b4aebdc8f87ccdc48bd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa5f052d32ad8f52f33ee5baea5af98a64f05b831b86ebced6c9422fb4845fcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86c28f1ae2f2fb61b7755ac14bc7f5346036735c75daccaa0e31b4e606cfb4c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fac0a986edf984b0e34eda0f969205d5bc92f23d0edacb3c6dd8721eaed2fb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf50ee7a67dc259915505ac9bbca0a709a016452caaeb0ffa14e5a5947db134e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cf50ee7a67dc259915505ac9bbca0a709a016452caaeb0ffa14e5a5947db134e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-16T00:08:57Z\\\",\\\"message\\\":\\\"} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post 
\\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:57Z is after 2025-08-24T17:21:41Z]\\\\nI0316 00:08:57.626466 7110 services_controller.go:356] Processing sync for service openshift-machine-config-operator/machine-config-controller for network=default\\\\nI0316 00:08:57.626473 7110 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-storage-version-migrator-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"7f9b8f25-db1a-4d02-a423-9afc5c2fb83c\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-storage-version-migrator-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}},\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:56Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-psjs7_openshift-ovn-kubernetes(2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d6bf581280690c99ddbbecd4cf4e215a12230d377978aa48acd3c05b85a3c56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1249f8b1c01db98230c10bde49bdb68f17caccc1ba0546e462df4384ca1658dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1249f8b1c01db98230
c10bde49bdb68f17caccc1ba0546e462df4384ca1658dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-psjs7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:18Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:18 crc kubenswrapper[4816]: I0316 00:09:18.667206 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:09:18 crc kubenswrapper[4816]: I0316 00:09:18.667342 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jqsjn" Mar 16 00:09:18 crc kubenswrapper[4816]: E0316 00:09:18.667404 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:09:18 crc kubenswrapper[4816]: I0316 00:09:18.667223 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:09:18 crc kubenswrapper[4816]: I0316 00:09:18.667342 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:09:18 crc kubenswrapper[4816]: E0316 00:09:18.667585 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jqsjn" podUID="84360ef9-0450-44c5-80eb-eab1bf8e808b" Mar 16 00:09:18 crc kubenswrapper[4816]: E0316 00:09:18.667752 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:09:18 crc kubenswrapper[4816]: E0316 00:09:18.667887 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:09:19 crc kubenswrapper[4816]: I0316 00:09:19.420044 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-szscw_e9789e58-12c8-4831-9401-af48a3e92209/kube-multus/0.log" Mar 16 00:09:19 crc kubenswrapper[4816]: I0316 00:09:19.420121 4816 generic.go:334] "Generic (PLEG): container finished" podID="e9789e58-12c8-4831-9401-af48a3e92209" containerID="e6f0b7e8f95bbbf3b90133349b8b13d2784582a5d95dae8dd159328b570ae962" exitCode=1 Mar 16 00:09:19 crc kubenswrapper[4816]: I0316 00:09:19.420172 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-szscw" event={"ID":"e9789e58-12c8-4831-9401-af48a3e92209","Type":"ContainerDied","Data":"e6f0b7e8f95bbbf3b90133349b8b13d2784582a5d95dae8dd159328b570ae962"} Mar 16 00:09:19 crc kubenswrapper[4816]: I0316 00:09:19.420856 4816 scope.go:117] "RemoveContainer" containerID="e6f0b7e8f95bbbf3b90133349b8b13d2784582a5d95dae8dd159328b570ae962" Mar 16 00:09:19 crc kubenswrapper[4816]: I0316 00:09:19.441478 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"580d9565-d55c-4108-af2d-07ab6fa7ea2e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2a41f624c9d40a637cf958648b795587652d7139d4b7cea2e8c70d1cdf05dd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59fcd3602ca5395b4f9b934768cad48d5504480af46a1fff8203eb46415b8769\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59fcd3602ca5395b4f9b934768cad48d5504480af46a1fff8203eb46415b8769\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:19Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:19 crc kubenswrapper[4816]: I0316 00:09:19.461956 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://723654bc91ce4d40a27b14a9c4df1ed6c8dff2517f378927d7a6a96aeaa6951b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:19Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:19 crc kubenswrapper[4816]: I0316 00:09:19.480680 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37869baff974556c22aadadfa4b528b5d6b9ad236107b4dbc945b3cdbf743f20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://2f9fe727708fdaca68596b7d305629ade45b061a88a81085f885d490ab99e24b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:19Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:19 crc kubenswrapper[4816]: I0316 00:09:19.493898 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cnhkf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e686cd4-bddf-463e-b471-e49ea862691e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f6c3cacd4d7f6866e66b7234b280540ef2c443531de7a11f6894323ef24a9fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bwzt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cnhkf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:19Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:19 crc kubenswrapper[4816]: I0316 00:09:19.512629 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kbgw5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b28986d-e33b-4876-ab6d-64d69960fb8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://544bab5013b513f755dd9fa88cbc71021b0cfe823aa7508af994c0b1342c3646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7
93426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zgr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0beb1aba8b8813abd98809200fa4dd348377427596c2935ceb012361634df2b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zgr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kbgw5\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:19Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:19 crc kubenswrapper[4816]: I0316 00:09:19.536450 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c8786d1-025d-4788-bafe-c2c2eaf8e398\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46db23088e8ba791d736f70a65bd066af80a5ec27c31332d9ac9c52530b21bfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubern
etes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da7b257af12b9ee605dc32a26d9565f866a9cafebb4936a0bde7f42ae9909f80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2523ea43f2490afb81354be8a2840f54a3be1a8d3154196e0c8ac365d09723a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e7f3b9b1fc1648908bbf44f9ad32e1eb510b68336a8b60c8417df2f622a36e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cl
uster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0316 00:07:59.373922 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0316 00:07:59.374075 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0316 00:07:59.374860 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2696044728/tls.crt::/tmp/serving-cert-2696044728/tls.key\\\\\\\"\\\\nI0316 00:07:59.560928 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0316 00:07:59.563996 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0316 00:07:59.564013 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0316 00:07:59.564035 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0316 00:07:59.564041 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0316 00:07:59.571475 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0316 00:07:59.571523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571531 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571540 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0316 00:07:59.571574 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0316 00:07:59.571588 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0316 00:07:59.571493 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0316 00:07:59.571597 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0316 00:07:59.573484 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81d0c40be9c3912864280cd98481f837ef29f69d85599b39decf5e9d7a97eff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:19Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:19 crc kubenswrapper[4816]: I0316 00:09:19.554609 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://516852108b3d17a15676573ff080d3d94fa48d8076ed0404a8d827e715c2555e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-16T00:09:19Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:19 crc kubenswrapper[4816]: I0316 00:09:19.575106 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:19Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:19 crc kubenswrapper[4816]: I0316 00:09:19.591354 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mt7bq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"03ef49f1-0c6a-443a-8df3-2db339c562ed\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d31459f93f54cc1cd99638b1f108b8997b34f246cf8c65e5462e672f04aaa7c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8979f751ec4797986924a3efea84d02bc1c1cf303e1e992f349c9083f2fe4f08\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8979f751ec4797986924a3efea84d02bc1c1cf303e1e992f349c9083f2fe4f08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa87691b0dc314209d88d1f4d5b2227abc1de527e8111831bb8243ba6192c300\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa87691b0dc314209d88d1f4d5b2227abc1de527e8111831bb8243ba6192c300\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33f82b993e3d729a5830e88136e74687c930d6ad1a15c64f003f311fda418bce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33f82b993e3d729a5830e88136e74687c930d6ad1a15c64f003f311fda418bce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b640c
26797535db11051d50aefd13b7b2759cfa19d163f53764b0fe8710cefdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b640c26797535db11051d50aefd13b7b2759cfa19d163f53764b0fe8710cefdc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe5359762a36b39c9f8ae939a1b9d094c42ffab59c4b9743a83b4391c171a512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe5359762a36b39c9f8ae939a1b9d094c42ffab59c4b9743a83b4391c171a512\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:36Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc31f057b776cf1925d265b64c7a9a9fb0993e4f3ffff530a48f28865e1c2954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc31f057b776cf1925d265b64c7a9a9fb0993e4f3ffff530a48f28865e1c2954\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mt7bq\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:19Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:19 crc kubenswrapper[4816]: I0316 00:09:19.603847 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lhpbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ec6a8ee-efd9-45df-bb35-706fcc90ebe9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a34b33696ef85102800de34359b2f8b4bf11c03ca6347dde766456bcea20e0e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-03-16T00:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c2x8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lhpbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:19Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:19 crc kubenswrapper[4816]: I0316 00:09:19.621963 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a695b85-75f5-42bb-a13c-aa20512355e4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6e7e97dbf041ed59e094f9be34956cb28d4774022777f2371a2eb752937f551\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fea427361067bc1a9d56b7b6699b072b5cdeee8345bf6618b8d81c6848f62098\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:20Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0316 00:06:49.819775 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0316 00:06:49.823008 1 observer_polling.go:159] Starting file observer\\\\nI0316 00:06:49.860579 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0316 00:06:49.866006 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0316 00:07:20.134248 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:19Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f3dfc8f46079b51e52802920a734bf796a00db2cf42501b9f7e202e0e9bc2ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5afc1a568801d941c814fe5e4eb91df609672a9431082d578d600f68f669aa3f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c60304164aa2f8e740ab25d779ef9b0aa66f6acca476bae45a13f4be44e37e33\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:19Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:19 crc kubenswrapper[4816]: I0316 00:09:19.638897 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3a9d51fa-9450-4f18-be16-0a93d64889b4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94c8e0e7b0ecd4d0f525f55538bff0b01653544be87c923fc152d4d982da7021\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02c81b3d94044c1f4f344dd8306cacf8db533a662535ddb0a6a4ffc1bb4cf1f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37dea82cd38ef9cfbda3626b789b21af2ada4b4a58bc2ecc6ce55eda60052277\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e44b0bbe4cebe4e52726f3f654f8b4a2953abcd0ad32db752c4641b3f0be1b00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://e44b0bbe4cebe4e52726f3f654f8b4a2953abcd0ad32db752c4641b3f0be1b00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:19Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:19 crc kubenswrapper[4816]: I0316 00:09:19.652019 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:19Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:19 crc kubenswrapper[4816]: I0316 00:09:19.670059 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:19Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:19 crc kubenswrapper[4816]: I0316 00:09:19.685254 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jqsjn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84360ef9-0450-44c5-80eb-eab1bf8e808b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxldb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxldb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jqsjn\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:19Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:19 crc kubenswrapper[4816]: I0316 00:09:19.709933 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49e35ef6-7a6d-438f-b598-4445d73db379\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbe90b49c5d42bb9d6d5d1b8f1d02b571403dd3f39d3118081351d72c4910914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\
":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://508e6dbe6f2f1392d16501e09ee2ec523a3c2a245cd96c6ab773e26e83b8bb6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b5a3223ff0fb5b96e5b87ee836add66498165759f4d6ed8cb6413f091967e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://603bdcbd4a6af5fda7ca09e4823e0db8d
7ec3e3eb38eddf9f062a14b433cb094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b061f16a65a27b9a5ef66231376b70fec084479a991b7fdd430957df76da32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e
49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminate
d\\\":{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:19Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:19 crc kubenswrapper[4816]: I0316 00:09:19.726870 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-szscw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9789e58-12c8-4831-9401-af48a3e92209\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6f0b7e8f95bbbf3b90133349b8b13d2784582a5d95dae8dd159328b570ae962\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6f0b7e8f95bbbf3b90133349b8b13d2784582a5d95dae8dd159328b570ae962\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-16T00:09:18Z\\\",\\\"message\\\":\\\"2026-03-16T00:08:33+00:00 [cnibincopy] Successfully copied files in 
/usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_c9e0d937-8428-483b-a1c0-f9ef411a7cbe\\\\n2026-03-16T00:08:33+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_c9e0d937-8428-483b-a1c0-f9ef411a7cbe to /host/opt/cni/bin/\\\\n2026-03-16T00:08:33Z [verbose] multus-daemon started\\\\n2026-03-16T00:08:33Z [verbose] Readiness Indicator file check\\\\n2026-03-16T00:09:18Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/c
ni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxf6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-szscw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:19Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:19 crc kubenswrapper[4816]: I0316 00:09:19.741891 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd08ece2-7636-4966-973a-e96a34b70b53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6dd22d6548986bf126a8ce6b30c7c11d8c37bd0c03539a55cc28fb7debcfc26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trwbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7003a4592a48f87ab63e9f5637c7665040f9784a
7c8f599c82ed61270ddb793c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trwbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jrdcz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:19Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:19 crc kubenswrapper[4816]: I0316 00:09:19.769423 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f47b6c54d5d72d827208f4d8228be260d9d19b4a720b3174349288dee2916641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e0a510814d60106574e3cd3e612c031ac900caa0f24a12a62d7cd1f6bfda1ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://826495d8ab8f414169bf36159278da0a040fec699a74b4aebdc8f87ccdc48bd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa5f052d32ad8f52f33ee5baea5af98a64f05b831b86ebced6c9422fb4845fcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86c28f1ae2f2fb61b7755ac14bc7f5346036735c75daccaa0e31b4e606cfb4c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fac0a986edf984b0e34eda0f969205d5bc92f23d0edacb3c6dd8721eaed2fb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf50ee7a67dc259915505ac9bbca0a709a016452caaeb0ffa14e5a5947db134e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cf50ee7a67dc259915505ac9bbca0a709a016452caaeb0ffa14e5a5947db134e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-16T00:08:57Z\\\",\\\"message\\\":\\\"} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post 
\\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:57Z is after 2025-08-24T17:21:41Z]\\\\nI0316 00:08:57.626466 7110 services_controller.go:356] Processing sync for service openshift-machine-config-operator/machine-config-controller for network=default\\\\nI0316 00:08:57.626473 7110 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-storage-version-migrator-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"7f9b8f25-db1a-4d02-a423-9afc5c2fb83c\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-storage-version-migrator-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}},\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:56Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-psjs7_openshift-ovn-kubernetes(2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d6bf581280690c99ddbbecd4cf4e215a12230d377978aa48acd3c05b85a3c56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1249f8b1c01db98230c10bde49bdb68f17caccc1ba0546e462df4384ca1658dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1249f8b1c01db98230
c10bde49bdb68f17caccc1ba0546e462df4384ca1658dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-psjs7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:19Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:20 crc kubenswrapper[4816]: I0316 00:09:20.135672 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:09:20 crc kubenswrapper[4816]: I0316 00:09:20.135733 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:09:20 crc kubenswrapper[4816]: I0316 00:09:20.135750 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:09:20 crc kubenswrapper[4816]: I0316 00:09:20.135780 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:09:20 crc kubenswrapper[4816]: I0316 00:09:20.135797 4816 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:09:20Z","lastTransitionTime":"2026-03-16T00:09:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:09:20 crc kubenswrapper[4816]: E0316 00:09:20.158421 4816 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d8fb348c-4907-4c8f-859a-735976530e03\\\",\\\"systemUUID\\\":\\\"e97abf98-298f-4589-a70a-4cfb5cb2994a\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:20Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:20 crc kubenswrapper[4816]: I0316 00:09:20.164026 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:09:20 crc kubenswrapper[4816]: I0316 00:09:20.164081 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:09:20 crc kubenswrapper[4816]: I0316 00:09:20.164117 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:09:20 crc kubenswrapper[4816]: I0316 00:09:20.164146 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:09:20 crc kubenswrapper[4816]: I0316 00:09:20.164166 4816 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:09:20Z","lastTransitionTime":"2026-03-16T00:09:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:09:20 crc kubenswrapper[4816]: E0316 00:09:20.186852 4816 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d8fb348c-4907-4c8f-859a-735976530e03\\\",\\\"systemUUID\\\":\\\"e97abf98-298f-4589-a70a-4cfb5cb2994a\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:20Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:20 crc kubenswrapper[4816]: I0316 00:09:20.191717 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:09:20 crc kubenswrapper[4816]: I0316 00:09:20.191763 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:09:20 crc kubenswrapper[4816]: I0316 00:09:20.191779 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:09:20 crc kubenswrapper[4816]: I0316 00:09:20.191802 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:09:20 crc kubenswrapper[4816]: I0316 00:09:20.191819 4816 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:09:20Z","lastTransitionTime":"2026-03-16T00:09:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:09:20 crc kubenswrapper[4816]: E0316 00:09:20.212081 4816 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d8fb348c-4907-4c8f-859a-735976530e03\\\",\\\"systemUUID\\\":\\\"e97abf98-298f-4589-a70a-4cfb5cb2994a\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:20Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:20 crc kubenswrapper[4816]: I0316 00:09:20.216064 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:09:20 crc kubenswrapper[4816]: I0316 00:09:20.216143 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:09:20 crc kubenswrapper[4816]: I0316 00:09:20.216166 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:09:20 crc kubenswrapper[4816]: I0316 00:09:20.216191 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:09:20 crc kubenswrapper[4816]: I0316 00:09:20.216207 4816 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:09:20Z","lastTransitionTime":"2026-03-16T00:09:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:09:20 crc kubenswrapper[4816]: E0316 00:09:20.235703 4816 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d8fb348c-4907-4c8f-859a-735976530e03\\\",\\\"systemUUID\\\":\\\"e97abf98-298f-4589-a70a-4cfb5cb2994a\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:20Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:20 crc kubenswrapper[4816]: I0316 00:09:20.240468 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:09:20 crc kubenswrapper[4816]: I0316 00:09:20.240545 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:09:20 crc kubenswrapper[4816]: I0316 00:09:20.240615 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:09:20 crc kubenswrapper[4816]: I0316 00:09:20.240659 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:09:20 crc kubenswrapper[4816]: I0316 00:09:20.240678 4816 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:09:20Z","lastTransitionTime":"2026-03-16T00:09:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:09:20 crc kubenswrapper[4816]: E0316 00:09:20.267532 4816 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d8fb348c-4907-4c8f-859a-735976530e03\\\",\\\"systemUUID\\\":\\\"e97abf98-298f-4589-a70a-4cfb5cb2994a\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:20Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:20 crc kubenswrapper[4816]: E0316 00:09:20.267833 4816 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 16 00:09:20 crc kubenswrapper[4816]: I0316 00:09:20.427757 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-szscw_e9789e58-12c8-4831-9401-af48a3e92209/kube-multus/0.log" Mar 16 00:09:20 crc kubenswrapper[4816]: I0316 00:09:20.427858 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-szscw" event={"ID":"e9789e58-12c8-4831-9401-af48a3e92209","Type":"ContainerStarted","Data":"b45d6058e72f7117fdfb86b3480de77c8592dba7dcd7932b2a5c34036da7af26"} Mar 16 00:09:20 crc kubenswrapper[4816]: I0316 00:09:20.452810 4816 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37869baff974556c22aadadfa4b528b5d6b9ad236107b4dbc945b3cdbf743f20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f9fe727708fdaca68596b7d305629ade45b061a88a81085f885d490ab99e24b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:20Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:20 crc kubenswrapper[4816]: I0316 00:09:20.470619 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cnhkf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e686cd4-bddf-463e-b471-e49ea862691e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f6c3cacd4d7f6866e66b7234b280540ef2c443531de7a11f6894323ef24a9fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bwzt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cnhkf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:20Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:20 crc kubenswrapper[4816]: I0316 00:09:20.488756 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kbgw5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b28986d-e33b-4876-ab6d-64d69960fb8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://544bab5013b513f755dd9fa88cbc71021b0cfe823aa7508af994c0b1342c3646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7
93426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zgr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0beb1aba8b8813abd98809200fa4dd348377427596c2935ceb012361634df2b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zgr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kbgw5\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:20Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:20 crc kubenswrapper[4816]: I0316 00:09:20.505908 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"580d9565-d55c-4108-af2d-07ab6fa7ea2e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2a41f624c9d40a637cf958648b795587652d7139d4b7cea2e8c70d1cdf05dd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPat
h\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59fcd3602ca5395b4f9b934768cad48d5504480af46a1fff8203eb46415b8769\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59fcd3602ca5395b4f9b934768cad48d5504480af46a1fff8203eb46415b8769\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:20Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:20 crc kubenswrapper[4816]: I0316 00:09:20.527830 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://723654bc91ce4d40a27b14a9c4df1ed6c8dff2517f378927d7a6a96aeaa6951b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:20Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:20 crc kubenswrapper[4816]: I0316 00:09:20.548582 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:20Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:20 crc kubenswrapper[4816]: I0316 00:09:20.583416 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mt7bq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"03ef49f1-0c6a-443a-8df3-2db339c562ed\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d31459f93f54cc1cd99638b1f108b8997b34f246cf8c65e5462e672f04aaa7c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8979f751ec4797986924a3efea84d02bc1c1cf303e1e992f349c9083f2fe4f08\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8979f751ec4797986924a3efea84d02bc1c1cf303e1e992f349c9083f2fe4f08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa87691b0dc314209d88d1f4d5b2227abc1de527e8111831bb8243ba6192c300\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa87691b0dc314209d88d1f4d5b2227abc1de527e8111831bb8243ba6192c300\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33f82b993e3d729a5830e88136e74687c930d6ad1a15c64f003f311fda418bce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33f82b993e3d729a5830e88136e74687c930d6ad1a15c64f003f311fda418bce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b640c
26797535db11051d50aefd13b7b2759cfa19d163f53764b0fe8710cefdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b640c26797535db11051d50aefd13b7b2759cfa19d163f53764b0fe8710cefdc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe5359762a36b39c9f8ae939a1b9d094c42ffab59c4b9743a83b4391c171a512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe5359762a36b39c9f8ae939a1b9d094c42ffab59c4b9743a83b4391c171a512\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:36Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc31f057b776cf1925d265b64c7a9a9fb0993e4f3ffff530a48f28865e1c2954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc31f057b776cf1925d265b64c7a9a9fb0993e4f3ffff530a48f28865e1c2954\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mt7bq\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:20Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:20 crc kubenswrapper[4816]: I0316 00:09:20.595691 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lhpbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ec6a8ee-efd9-45df-bb35-706fcc90ebe9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a34b33696ef85102800de34359b2f8b4bf11c03ca6347dde766456bcea20e0e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-03-16T00:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c2x8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lhpbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:20Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:20 crc kubenswrapper[4816]: I0316 00:09:20.611623 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c8786d1-025d-4788-bafe-c2c2eaf8e398\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46db23088e8ba791d736f70a65bd066af80a5ec27c31332d9ac9c52530b21bfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da7b257af12b9ee605dc32a26d9565f866a9cafebb4936a0bde7f42ae9909f80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2523ea43f2490afb81354be8a2840f54a3be1a8d3154196e0c8ac365d09723a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e7f3b9b1fc1648908bbf44f9ad32e1eb510b68336a8b60c8417df2f622a36e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:59Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0316 00:07:59.373922 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0316 00:07:59.374075 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0316 00:07:59.374860 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2696044728/tls.crt::/tmp/serving-cert-2696044728/tls.key\\\\\\\"\\\\nI0316 00:07:59.560928 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0316 00:07:59.563996 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0316 00:07:59.564013 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0316 00:07:59.564035 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0316 00:07:59.564041 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0316 00:07:59.571475 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0316 00:07:59.571523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571531 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571540 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0316 00:07:59.571574 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0316 00:07:59.571588 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0316 00:07:59.571493 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0316 00:07:59.571597 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0316 00:07:59.573484 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81d0c40be9c3912864280cd98481f837ef29f69d85599b39decf5e9d7a97eff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e
85121f67007a774524167b1063cf1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:20Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:20 crc kubenswrapper[4816]: I0316 00:09:20.625839 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://516852108b3d17a15676573ff080d3d94fa48d8076ed0404a8d827e715c2555e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-16T00:09:20Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:20 crc kubenswrapper[4816]: I0316 00:09:20.644696 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:20Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:20 crc kubenswrapper[4816]: I0316 00:09:20.659686 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:20Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:20 crc kubenswrapper[4816]: I0316 00:09:20.667536 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:09:20 crc kubenswrapper[4816]: I0316 00:09:20.667615 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:09:20 crc kubenswrapper[4816]: I0316 00:09:20.667654 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:09:20 crc kubenswrapper[4816]: I0316 00:09:20.667569 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jqsjn" Mar 16 00:09:20 crc kubenswrapper[4816]: E0316 00:09:20.667763 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:09:20 crc kubenswrapper[4816]: E0316 00:09:20.667906 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:09:20 crc kubenswrapper[4816]: E0316 00:09:20.668065 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jqsjn" podUID="84360ef9-0450-44c5-80eb-eab1bf8e808b" Mar 16 00:09:20 crc kubenswrapper[4816]: E0316 00:09:20.668186 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:09:20 crc kubenswrapper[4816]: I0316 00:09:20.672770 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jqsjn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84360ef9-0450-44c5-80eb-eab1bf8e808b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxldb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxldb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jqsjn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:20Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:20 crc 
kubenswrapper[4816]: I0316 00:09:20.687852 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a695b85-75f5-42bb-a13c-aa20512355e4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6e7e97dbf041ed59e094f9be34956cb28d4774022777f2371a2eb752937f551\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fea427361067bc1a9d56b7b6699b072b5cdeee8345bf6618b8d81c6848f62098\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:20Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start 
--config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0316 00:06:49.819775 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0316 00:06:49.823008 1 observer_polling.go:159] Starting file observer\\\\nI0316 00:06:49.860579 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0316 00:06:49.866006 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0316 00:07:20.134248 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:19Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f3dfc8f46079b51e52802920a734bf796a00db2cf42501b9f7e202e0e9bc2ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5afc1a568801d941c814fe5e4eb91df609672a9431082d578d600f68f669aa3f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c60304164aa2f8e740ab25d779ef9b0aa66f6acca476bae45a13f4be44e37e33\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:20Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:20 crc kubenswrapper[4816]: I0316 00:09:20.703972 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3a9d51fa-9450-4f18-be16-0a93d64889b4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94c8e0e7b0ecd4d0f525f55538bff0b01653544be87c923fc152d4d982da7021\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02c81b3d94044c1f4f344dd8306cacf8db533a662535ddb0a6a4ffc1bb4cf1f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37dea82cd38ef9cfbda3626b789b21af2ada4b4a58bc2ecc6ce55eda60052277\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e44b0bbe4cebe4e52726f3f654f8b4a2953abcd0ad32db752c4641b3f0be1b00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://e44b0bbe4cebe4e52726f3f654f8b4a2953abcd0ad32db752c4641b3f0be1b00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:20Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:20 crc kubenswrapper[4816]: I0316 00:09:20.717538 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd08ece2-7636-4966-973a-e96a34b70b53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6dd22d6548986bf126a8ce6b30c7c11d8c37bd0c03539a55cc28fb7debcfc26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trwbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7003a4592a48f87ab63e9f5637c7665040f9784a
7c8f599c82ed61270ddb793c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trwbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jrdcz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:20Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:20 crc kubenswrapper[4816]: I0316 00:09:20.744393 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f47b6c54d5d72d827208f4d8228be260d9d19b4a720b3174349288dee2916641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e0a510814d60106574e3cd3e612c031ac900caa0f24a12a62d7cd1f6bfda1ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://826495d8ab8f414169bf36159278da0a040fec699a74b4aebdc8f87ccdc48bd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa5f052d32ad8f52f33ee5baea5af98a64f05b831b86ebced6c9422fb4845fcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86c28f1ae2f2fb61b7755ac14bc7f5346036735c75daccaa0e31b4e606cfb4c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fac0a986edf984b0e34eda0f969205d5bc92f23d0edacb3c6dd8721eaed2fb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf50ee7a67dc259915505ac9bbca0a709a016452caaeb0ffa14e5a5947db134e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cf50ee7a67dc259915505ac9bbca0a709a016452caaeb0ffa14e5a5947db134e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-16T00:08:57Z\\\",\\\"message\\\":\\\"} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post 
\\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:57Z is after 2025-08-24T17:21:41Z]\\\\nI0316 00:08:57.626466 7110 services_controller.go:356] Processing sync for service openshift-machine-config-operator/machine-config-controller for network=default\\\\nI0316 00:08:57.626473 7110 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-storage-version-migrator-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"7f9b8f25-db1a-4d02-a423-9afc5c2fb83c\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-storage-version-migrator-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}},\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:56Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-psjs7_openshift-ovn-kubernetes(2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d6bf581280690c99ddbbecd4cf4e215a12230d377978aa48acd3c05b85a3c56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1249f8b1c01db98230c10bde49bdb68f17caccc1ba0546e462df4384ca1658dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1249f8b1c01db98230
c10bde49bdb68f17caccc1ba0546e462df4384ca1658dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-psjs7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:20Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:20 crc kubenswrapper[4816]: I0316 00:09:20.769953 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"49e35ef6-7a6d-438f-b598-4445d73db379\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbe90b49c5d42bb9d6d5d1b8f1d02b571403dd3f39d3118081351d72c4910914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://508e6dbe6f2f1392d16501e09ee2ec523a3c2a245cd96c6ab773e26e83b8bb6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b5a3223ff0fb5b96e5b87ee836add66498165759f4d6ed8cb6413f091967e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://603bdcbd4a6af5fda7ca09e4823e0db8d7ec3e3eb38eddf9f062a14b433cb094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b061f16a65a27b9a5ef66231376b70fec084479a991b7fdd430957df76da32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:20Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:20 crc kubenswrapper[4816]: I0316 00:09:20.786148 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-szscw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9789e58-12c8-4831-9401-af48a3e92209\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b45d6058e72f7117fdfb86b3480de77c8592dba7dcd7932b2a5c34036da7af26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f1
3fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6f0b7e8f95bbbf3b90133349b8b13d2784582a5d95dae8dd159328b570ae962\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-16T00:09:18Z\\\",\\\"message\\\":\\\"2026-03-16T00:08:33+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_c9e0d937-8428-483b-a1c0-f9ef411a7cbe\\\\n2026-03-16T00:08:33+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_c9e0d937-8428-483b-a1c0-f9ef411a7cbe to /host/opt/cni/bin/\\\\n2026-03-16T00:08:33Z [verbose] multus-daemon started\\\\n2026-03-16T00:08:33Z [verbose] Readiness Indicator file check\\\\n2026-03-16T00:09:18Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:31Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:09:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxf6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-szscw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:20Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:22 crc kubenswrapper[4816]: I0316 00:09:22.667255 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:09:22 crc kubenswrapper[4816]: E0316 00:09:22.667664 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:09:22 crc kubenswrapper[4816]: I0316 00:09:22.667431 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jqsjn" Mar 16 00:09:22 crc kubenswrapper[4816]: E0316 00:09:22.667800 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jqsjn" podUID="84360ef9-0450-44c5-80eb-eab1bf8e808b" Mar 16 00:09:22 crc kubenswrapper[4816]: I0316 00:09:22.667340 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:09:22 crc kubenswrapper[4816]: E0316 00:09:22.667862 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:09:22 crc kubenswrapper[4816]: I0316 00:09:22.667462 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:09:22 crc kubenswrapper[4816]: E0316 00:09:22.667914 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:09:22 crc kubenswrapper[4816]: E0316 00:09:22.794914 4816 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 16 00:09:24 crc kubenswrapper[4816]: I0316 00:09:24.666919 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jqsjn" Mar 16 00:09:24 crc kubenswrapper[4816]: I0316 00:09:24.666974 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:09:24 crc kubenswrapper[4816]: I0316 00:09:24.667038 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:09:24 crc kubenswrapper[4816]: I0316 00:09:24.667049 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:09:24 crc kubenswrapper[4816]: E0316 00:09:24.667125 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jqsjn" podUID="84360ef9-0450-44c5-80eb-eab1bf8e808b" Mar 16 00:09:24 crc kubenswrapper[4816]: E0316 00:09:24.667219 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:09:24 crc kubenswrapper[4816]: E0316 00:09:24.667851 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:09:24 crc kubenswrapper[4816]: E0316 00:09:24.667994 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:09:24 crc kubenswrapper[4816]: I0316 00:09:24.669063 4816 scope.go:117] "RemoveContainer" containerID="cf50ee7a67dc259915505ac9bbca0a709a016452caaeb0ffa14e5a5947db134e" Mar 16 00:09:25 crc kubenswrapper[4816]: I0316 00:09:25.452608 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-psjs7_2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88/ovnkube-controller/2.log" Mar 16 00:09:25 crc kubenswrapper[4816]: I0316 00:09:25.455997 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" event={"ID":"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88","Type":"ContainerStarted","Data":"ba7195b779774df29cc722244d9b4290e7f75033f6e135e41288fbb5310a7c3c"} Mar 16 00:09:25 crc kubenswrapper[4816]: I0316 00:09:25.456464 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" Mar 16 00:09:25 crc kubenswrapper[4816]: I0316 00:09:25.470464 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cnhkf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e686cd4-bddf-463e-b471-e49ea862691e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f6c3cacd4d7f6866e66b7234b280540ef2c443531de7a11f6894323ef24a9fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bwzt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cnhkf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:25Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:25 crc kubenswrapper[4816]: I0316 00:09:25.484096 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kbgw5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b28986d-e33b-4876-ab6d-64d69960fb8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://544bab5013b513f755dd9fa88cbc71021b0cfe823aa7508af994c0b1342c3646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7
93426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zgr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0beb1aba8b8813abd98809200fa4dd348377427596c2935ceb012361634df2b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zgr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kbgw5\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:25Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:25 crc kubenswrapper[4816]: I0316 00:09:25.495330 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"580d9565-d55c-4108-af2d-07ab6fa7ea2e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2a41f624c9d40a637cf958648b795587652d7139d4b7cea2e8c70d1cdf05dd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPat
h\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59fcd3602ca5395b4f9b934768cad48d5504480af46a1fff8203eb46415b8769\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59fcd3602ca5395b4f9b934768cad48d5504480af46a1fff8203eb46415b8769\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:25Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:25 crc kubenswrapper[4816]: I0316 00:09:25.507519 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://723654bc91ce4d40a27b14a9c4df1ed6c8dff2517f378927d7a6a96aeaa6951b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:25Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:25 crc kubenswrapper[4816]: I0316 00:09:25.525009 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37869baff974556c22aadadfa4b528b5d6b9ad236107b4dbc945b3cdbf743f20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://2f9fe727708fdaca68596b7d305629ade45b061a88a81085f885d490ab99e24b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:25Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:25 crc kubenswrapper[4816]: I0316 00:09:25.545007 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mt7bq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"03ef49f1-0c6a-443a-8df3-2db339c562ed\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d31459f93f54cc1cd99638b1f108b8997b34f246cf8c65e5462e672f04aaa7c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8979f751ec4797986924a3efea84d02bc1c1cf303e1e992f349c9083f2fe4f08\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8979f751ec4797986924a3efea84d02bc1c1cf303e1e992f349c9083f2fe4f08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa87691b0dc314209d88d1f4d5b2227abc1de527e8111831bb8243ba6192c300\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa87691b0dc314209d88d1f4d5b2227abc1de527e8111831bb8243ba6192c300\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33f82b993e3d729a5830e88136e74687c930d6ad1a15c64f003f311fda418bce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33f82b993e3d729a5830e88136e74687c930d6ad1a15c64f003f311fda418bce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b640c
26797535db11051d50aefd13b7b2759cfa19d163f53764b0fe8710cefdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b640c26797535db11051d50aefd13b7b2759cfa19d163f53764b0fe8710cefdc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe5359762a36b39c9f8ae939a1b9d094c42ffab59c4b9743a83b4391c171a512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe5359762a36b39c9f8ae939a1b9d094c42ffab59c4b9743a83b4391c171a512\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:36Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc31f057b776cf1925d265b64c7a9a9fb0993e4f3ffff530a48f28865e1c2954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc31f057b776cf1925d265b64c7a9a9fb0993e4f3ffff530a48f28865e1c2954\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mt7bq\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:25Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:25 crc kubenswrapper[4816]: I0316 00:09:25.555615 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lhpbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ec6a8ee-efd9-45df-bb35-706fcc90ebe9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a34b33696ef85102800de34359b2f8b4bf11c03ca6347dde766456bcea20e0e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-03-16T00:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c2x8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lhpbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:25Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:25 crc kubenswrapper[4816]: I0316 00:09:25.568248 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c8786d1-025d-4788-bafe-c2c2eaf8e398\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46db23088e8ba791d736f70a65bd066af80a5ec27c31332d9ac9c52530b21bfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da7b257af12b9ee605dc32a26d9565f866a9cafebb4936a0bde7f42ae9909f80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2523ea43f2490afb81354be8a2840f54a3be1a8d3154196e0c8ac365d09723a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e7f3b9b1fc1648908bbf44f9ad32e1eb510b68336a8b60c8417df2f622a36e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:59Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0316 00:07:59.373922 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0316 00:07:59.374075 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0316 00:07:59.374860 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2696044728/tls.crt::/tmp/serving-cert-2696044728/tls.key\\\\\\\"\\\\nI0316 00:07:59.560928 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0316 00:07:59.563996 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0316 00:07:59.564013 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0316 00:07:59.564035 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0316 00:07:59.564041 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0316 00:07:59.571475 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0316 00:07:59.571523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571531 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571540 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0316 00:07:59.571574 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0316 00:07:59.571588 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0316 00:07:59.571493 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0316 00:07:59.571597 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0316 00:07:59.573484 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81d0c40be9c3912864280cd98481f837ef29f69d85599b39decf5e9d7a97eff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e
85121f67007a774524167b1063cf1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:25Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:25 crc kubenswrapper[4816]: I0316 00:09:25.582670 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://516852108b3d17a15676573ff080d3d94fa48d8076ed0404a8d827e715c2555e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-16T00:09:25Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:25 crc kubenswrapper[4816]: I0316 00:09:25.599071 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:25Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:25 crc kubenswrapper[4816]: I0316 00:09:25.617592 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:25Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:25 crc kubenswrapper[4816]: I0316 00:09:25.635416 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jqsjn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84360ef9-0450-44c5-80eb-eab1bf8e808b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxldb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxldb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jqsjn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:25Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:25 crc 
kubenswrapper[4816]: I0316 00:09:25.650115 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a695b85-75f5-42bb-a13c-aa20512355e4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6e7e97dbf041ed59e094f9be34956cb28d4774022777f2371a2eb752937f551\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fea427361067bc1a9d56b7b6699b072b5cdeee8345bf6618b8d81c6848f62098\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:20Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start 
--config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0316 00:06:49.819775 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0316 00:06:49.823008 1 observer_polling.go:159] Starting file observer\\\\nI0316 00:06:49.860579 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0316 00:06:49.866006 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0316 00:07:20.134248 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:19Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f3dfc8f46079b51e52802920a734bf796a00db2cf42501b9f7e202e0e9bc2ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5afc1a568801d941c814fe5e4eb91df609672a9431082d578d600f68f669aa3f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c60304164aa2f8e740ab25d779ef9b0aa66f6acca476bae45a13f4be44e37e33\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:25Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:25 crc kubenswrapper[4816]: I0316 00:09:25.662190 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3a9d51fa-9450-4f18-be16-0a93d64889b4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94c8e0e7b0ecd4d0f525f55538bff0b01653544be87c923fc152d4d982da7021\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02c81b3d94044c1f4f344dd8306cacf8db533a662535ddb0a6a4ffc1bb4cf1f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37dea82cd38ef9cfbda3626b789b21af2ada4b4a58bc2ecc6ce55eda60052277\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e44b0bbe4cebe4e52726f3f654f8b4a2953abcd0ad32db752c4641b3f0be1b00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://e44b0bbe4cebe4e52726f3f654f8b4a2953abcd0ad32db752c4641b3f0be1b00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:25Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:25 crc kubenswrapper[4816]: I0316 00:09:25.676617 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:25Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:25 crc kubenswrapper[4816]: I0316 00:09:25.697011 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f47b6c54d5d72d827208f4d8228be260d9d19b4a720b3174349288dee2916641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e0a510814d60106574e3cd3e612c031ac900caa0f24a12a62d7cd1f6bfda1ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://826495d8ab8f414169bf36159278da0a040fec699a74b4aebdc8f87ccdc48bd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa5f052d32ad8f52f33ee5baea5af98a64f05b831b86ebced6c9422fb4845fcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86c28f1ae2f2fb61b7755ac14bc7f5346036735c75daccaa0e31b4e606cfb4c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fac0a986edf984b0e34eda0f969205d5bc92f23d0edacb3c6dd8721eaed2fb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba7195b779774df29cc722244d9b4290e7f75033f6e135e41288fbb5310a7c3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cf50ee7a67dc259915505ac9bbca0a709a016452caaeb0ffa14e5a5947db134e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-16T00:08:57Z\\\",\\\"message\\\":\\\"} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post 
\\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:57Z is after 2025-08-24T17:21:41Z]\\\\nI0316 00:08:57.626466 7110 services_controller.go:356] Processing sync for service openshift-machine-config-operator/machine-config-controller for network=default\\\\nI0316 00:08:57.626473 7110 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-storage-version-migrator-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"7f9b8f25-db1a-4d02-a423-9afc5c2fb83c\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-storage-version-migrator-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, 
Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}},\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:56Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountP
ath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d6bf581280690c99ddbbecd4cf4e215a12230d377978aa48acd3c05b85a3c56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1249f8b1c01db98230c10bde49bdb68f17caccc1ba0546e462df4384ca1658dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1249f8b1c01db98230c10bde49bdb68f17caccc1ba0546e462df4384ca1658dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-psjs7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:25Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:25 crc kubenswrapper[4816]: I0316 00:09:25.715169 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"49e35ef6-7a6d-438f-b598-4445d73db379\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbe90b49c5d42bb9d6d5d1b8f1d02b571403dd3f39d3118081351d72c4910914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://508e6dbe6f2f1392d16501e09ee2ec523a3c2a245cd96c6ab773e26e83b8bb6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b5a3223ff0fb5b96e5b87ee836add66498165759f4d6ed8cb6413f091967e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://603bdcbd4a6af5fda7ca09e4823e0db8d7ec3e3eb38eddf9f062a14b433cb094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b061f16a65a27b9a5ef66231376b70fec084479a991b7fdd430957df76da32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:25Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:25 crc kubenswrapper[4816]: I0316 00:09:25.726155 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-szscw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9789e58-12c8-4831-9401-af48a3e92209\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b45d6058e72f7117fdfb86b3480de77c8592dba7dcd7932b2a5c34036da7af26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f1
3fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6f0b7e8f95bbbf3b90133349b8b13d2784582a5d95dae8dd159328b570ae962\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-16T00:09:18Z\\\",\\\"message\\\":\\\"2026-03-16T00:08:33+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_c9e0d937-8428-483b-a1c0-f9ef411a7cbe\\\\n2026-03-16T00:08:33+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_c9e0d937-8428-483b-a1c0-f9ef411a7cbe to /host/opt/cni/bin/\\\\n2026-03-16T00:08:33Z [verbose] multus-daemon started\\\\n2026-03-16T00:08:33Z [verbose] Readiness Indicator file check\\\\n2026-03-16T00:09:18Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:31Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:09:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxf6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-szscw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:25Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:25 crc kubenswrapper[4816]: I0316 00:09:25.736928 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd08ece2-7636-4966-973a-e96a34b70b53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6dd22d6548986bf126a8ce6b30c7c11d8c37bd0c03539a55cc28fb7debcfc26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09
\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trwbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7003a4592a48f87ab63e9f5637c7665040f9784a7c8f599c82ed61270ddb793c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trwbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jrdcz\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:25Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:26 crc kubenswrapper[4816]: I0316 00:09:26.463660 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-psjs7_2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88/ovnkube-controller/3.log" Mar 16 00:09:26 crc kubenswrapper[4816]: I0316 00:09:26.464922 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-psjs7_2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88/ovnkube-controller/2.log" Mar 16 00:09:26 crc kubenswrapper[4816]: I0316 00:09:26.468788 4816 generic.go:334] "Generic (PLEG): container finished" podID="2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" containerID="ba7195b779774df29cc722244d9b4290e7f75033f6e135e41288fbb5310a7c3c" exitCode=1 Mar 16 00:09:26 crc kubenswrapper[4816]: I0316 00:09:26.468866 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" event={"ID":"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88","Type":"ContainerDied","Data":"ba7195b779774df29cc722244d9b4290e7f75033f6e135e41288fbb5310a7c3c"} Mar 16 00:09:26 crc kubenswrapper[4816]: I0316 00:09:26.468937 4816 scope.go:117] "RemoveContainer" containerID="cf50ee7a67dc259915505ac9bbca0a709a016452caaeb0ffa14e5a5947db134e" Mar 16 00:09:26 crc kubenswrapper[4816]: I0316 00:09:26.470230 4816 scope.go:117] "RemoveContainer" containerID="ba7195b779774df29cc722244d9b4290e7f75033f6e135e41288fbb5310a7c3c" Mar 16 00:09:26 crc kubenswrapper[4816]: E0316 00:09:26.470586 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-psjs7_openshift-ovn-kubernetes(2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88)\"" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" podUID="2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" Mar 16 00:09:26 crc kubenswrapper[4816]: I0316 00:09:26.506446 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49e35ef6-7a6d-438f-b598-4445d73db379\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbe90b49c5d42bb9d6d5d1b8f1d02b571403dd3f39d3118081351d72c4910914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resourc
es\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://508e6dbe6f2f1392d16501e09ee2ec523a3c2a245cd96c6ab773e26e83b8bb6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b5a3223ff0fb5b96e5b87ee836add66498165759f4d6ed8cb6413f091967e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://603bdcbd4a6af5fda7ca09e4823e0db8d7ec3e3eb38eddf9f062a14b433cb094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b061f16a65a27b9a5ef66231376b70fec084479a991b7fdd430957df76da32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41
a14ecb5aaea02daf0c45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:26Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:26 crc kubenswrapper[4816]: I0316 00:09:26.525660 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-szscw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9789e58-12c8-4831-9401-af48a3e92209\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b45d6058e72f7117fdfb86b3480de77c8592dba7dcd7932b2a5c34036da7af26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6f0b7e8f95bbbf3b90133349b8b13d2784582a5d95dae8dd159328b570ae962\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-16T00:09:18Z\\\",\\\"message\\\":\\\"2026-03-16T00:08:33+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_c9e0d937-8428-483b-a1c0-f9ef411a7cbe\\\\n2026-03-16T00:08:33+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_c9e0d937-8428-483b-a1c0-f9ef411a7cbe to /host/opt/cni/bin/\\\\n2026-03-16T00:08:33Z [verbose] multus-daemon started\\\\n2026-03-16T00:08:33Z [verbose] 
Readiness Indicator file check\\\\n2026-03-16T00:09:18Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:31Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:09:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxf6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-szscw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:26Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:26 crc kubenswrapper[4816]: I0316 00:09:26.543127 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd08ece2-7636-4966-973a-e96a34b70b53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6dd22d6548986bf126a8ce6b30c7c11d8c37bd0c03539a55cc28fb7debcfc26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trwbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7003a4592a48f87ab63e9f5637c7665040f9784a
7c8f599c82ed61270ddb793c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trwbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jrdcz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:26Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:26 crc kubenswrapper[4816]: I0316 00:09:26.566474 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f47b6c54d5d72d827208f4d8228be260d9d19b4a720b3174349288dee2916641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e0a510814d60106574e3cd3e612c031ac900caa0f24a12a62d7cd1f6bfda1ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://826495d8ab8f414169bf36159278da0a040fec699a74b4aebdc8f87ccdc48bd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa5f052d32ad8f52f33ee5baea5af98a64f05b831b86ebced6c9422fb4845fcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86c28f1ae2f2fb61b7755ac14bc7f5346036735c75daccaa0e31b4e606cfb4c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fac0a986edf984b0e34eda0f969205d5bc92f23d0edacb3c6dd8721eaed2fb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba7195b779774df29cc722244d9b4290e7f75033f6e135e41288fbb5310a7c3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cf50ee7a67dc259915505ac9bbca0a709a016452caaeb0ffa14e5a5947db134e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-16T00:08:57Z\\\",\\\"message\\\":\\\"} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post 
\\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:57Z is after 2025-08-24T17:21:41Z]\\\\nI0316 00:08:57.626466 7110 services_controller.go:356] Processing sync for service openshift-machine-config-operator/machine-config-controller for network=default\\\\nI0316 00:08:57.626473 7110 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-storage-version-migrator-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"7f9b8f25-db1a-4d02-a423-9afc5c2fb83c\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-storage-version-migrator-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}},\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:56Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba7195b779774df29cc722244d9b4290e7f75033f6e135e41288fbb5310a7c3c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-16T00:09:25Z\\\",\\\"message\\\":\\\":[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {960d98b2-dc64-4e93-a4b6-9b19847af71e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0316 00:09:25.816315 7422 services_controller.go:443] Built service openshift-etcd-operator/metrics LB cluster-wide configs for network=default: 
[]services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.188\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:443, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI0316 00:09:25.816331 7422 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kbgw5\\\\nI0316 00:09:25.816342 7422 services_controller.go:444] Built service openshift-etcd-operator/metrics LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI0316 00:09:25.815541 7422 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-target-xd92c\\\\nI0316 00:09:25.815313 7422 services_controller.go:443] Built service openshift-machine-config-operator/machine-config-controller LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.16\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:9001, 
clus\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\
"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d6bf581280690c99ddbbecd4cf4e215a12230d377978aa48acd3c05b85a3c56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1249f8b1c01db98230c10bde49bdb68f17caccc1ba0546e462df4384ca1658dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1249f8b1c01db98230c10bde49bdb68f17caccc1ba0546e462df4384ca1658d
d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-psjs7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:26Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:26 crc kubenswrapper[4816]: I0316 00:09:26.581790 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"580d9565-d55c-4108-af2d-07ab6fa7ea2e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2a41f624c9d40a637cf958648b795587652d7139d4b7cea2e8c70d1cdf05dd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59fcd3602ca5395b4f9b934768cad48d5504480af46a1fff8203eb46415b8769\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59fcd3602ca5395b4f9b934768cad48d5504480af46a1fff8203eb46415b8769\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:26Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:26 crc kubenswrapper[4816]: I0316 00:09:26.598115 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://723654bc91ce4d40a27b14a9c4df1ed6c8dff2517f378927d7a6a96aeaa6951b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:26Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:26 crc kubenswrapper[4816]: I0316 00:09:26.615700 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37869baff974556c22aadadfa4b528b5d6b9ad236107b4dbc945b3cdbf743f20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://2f9fe727708fdaca68596b7d305629ade45b061a88a81085f885d490ab99e24b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:26Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:26 crc kubenswrapper[4816]: I0316 00:09:26.626946 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cnhkf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e686cd4-bddf-463e-b471-e49ea862691e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f6c3cacd4d7f6866e66b7234b280540ef2c443531de7a11f6894323ef24a9fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bwzt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cnhkf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:26Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:26 crc kubenswrapper[4816]: I0316 00:09:26.643126 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kbgw5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b28986d-e33b-4876-ab6d-64d69960fb8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://544bab5013b513f755dd9fa88cbc71021b0cfe823aa7508af994c0b1342c3646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7
93426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zgr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0beb1aba8b8813abd98809200fa4dd348377427596c2935ceb012361634df2b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zgr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kbgw5\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:26Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:26 crc kubenswrapper[4816]: I0316 00:09:26.659610 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c8786d1-025d-4788-bafe-c2c2eaf8e398\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46db23088e8ba791d736f70a65bd066af80a5ec27c31332d9ac9c52530b21bfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubern
etes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da7b257af12b9ee605dc32a26d9565f866a9cafebb4936a0bde7f42ae9909f80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2523ea43f2490afb81354be8a2840f54a3be1a8d3154196e0c8ac365d09723a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e7f3b9b1fc1648908bbf44f9ad32e1eb510b68336a8b60c8417df2f622a36e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cl
uster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0316 00:07:59.373922 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0316 00:07:59.374075 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0316 00:07:59.374860 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2696044728/tls.crt::/tmp/serving-cert-2696044728/tls.key\\\\\\\"\\\\nI0316 00:07:59.560928 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0316 00:07:59.563996 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0316 00:07:59.564013 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0316 00:07:59.564035 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0316 00:07:59.564041 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0316 00:07:59.571475 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0316 00:07:59.571523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571531 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571540 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0316 00:07:59.571574 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0316 00:07:59.571588 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0316 00:07:59.571493 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0316 00:07:59.571597 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0316 00:07:59.573484 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81d0c40be9c3912864280cd98481f837ef29f69d85599b39decf5e9d7a97eff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:26Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:26 crc kubenswrapper[4816]: I0316 00:09:26.667640 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:09:26 crc kubenswrapper[4816]: E0316 00:09:26.667788 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:09:26 crc kubenswrapper[4816]: I0316 00:09:26.667845 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:09:26 crc kubenswrapper[4816]: I0316 00:09:26.667978 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jqsjn" Mar 16 00:09:26 crc kubenswrapper[4816]: E0316 00:09:26.668168 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:09:26 crc kubenswrapper[4816]: I0316 00:09:26.668514 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:09:26 crc kubenswrapper[4816]: E0316 00:09:26.668674 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:09:26 crc kubenswrapper[4816]: E0316 00:09:26.668839 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jqsjn" podUID="84360ef9-0450-44c5-80eb-eab1bf8e808b" Mar 16 00:09:26 crc kubenswrapper[4816]: I0316 00:09:26.674383 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://516852108b3d17a15676573ff080d3d94fa48d8076ed0404a8d827e715c2555e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"r
unning\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:26Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:26 crc kubenswrapper[4816]: I0316 00:09:26.690368 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:26Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:26 crc kubenswrapper[4816]: I0316 00:09:26.717743 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mt7bq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"03ef49f1-0c6a-443a-8df3-2db339c562ed\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d31459f93f54cc1cd99638b1f108b8997b34f246cf8c65e5462e672f04aaa7c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8979f751ec4797986924a3efea84d02bc1c1cf303e1e992f349c9083f2fe4f08\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8979f751ec4797986924a3efea84d02bc1c1cf303e1e992f349c9083f2fe4f08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa87691b0dc314209d88d1f4d5b2227abc1de527e8111831bb8243ba6192c300\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa87691b0dc314209d88d1f4d5b2227abc1de527e8111831bb8243ba6192c300\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33f82b993e3d729a5830e88136e74687c930d6ad1a15c64f003f311fda418bce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33f82b993e3d729a5830e88136e74687c930d6ad1a15c64f003f311fda418bce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b640c
26797535db11051d50aefd13b7b2759cfa19d163f53764b0fe8710cefdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b640c26797535db11051d50aefd13b7b2759cfa19d163f53764b0fe8710cefdc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe5359762a36b39c9f8ae939a1b9d094c42ffab59c4b9743a83b4391c171a512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe5359762a36b39c9f8ae939a1b9d094c42ffab59c4b9743a83b4391c171a512\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:36Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc31f057b776cf1925d265b64c7a9a9fb0993e4f3ffff530a48f28865e1c2954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc31f057b776cf1925d265b64c7a9a9fb0993e4f3ffff530a48f28865e1c2954\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mt7bq\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:26Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:26 crc kubenswrapper[4816]: I0316 00:09:26.734232 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lhpbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ec6a8ee-efd9-45df-bb35-706fcc90ebe9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a34b33696ef85102800de34359b2f8b4bf11c03ca6347dde766456bcea20e0e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-03-16T00:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c2x8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lhpbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:26Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:26 crc kubenswrapper[4816]: I0316 00:09:26.751697 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a695b85-75f5-42bb-a13c-aa20512355e4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6e7e97dbf041ed59e094f9be34956cb28d4774022777f2371a2eb752937f551\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fea427361067bc1a9d56b7b6699b072b5cdeee8345bf6618b8d81c6848f62098\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:20Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0316 00:06:49.819775 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0316 00:06:49.823008 1 observer_polling.go:159] Starting file observer\\\\nI0316 00:06:49.860579 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0316 00:06:49.866006 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0316 00:07:20.134248 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:19Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f3dfc8f46079b51e52802920a734bf796a00db2cf42501b9f7e202e0e9bc2ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5afc1a568801d941c814fe5e4eb91df609672a9431082d578d600f68f669aa3f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c60304164aa2f8e740ab25d779ef9b0aa66f6acca476bae45a13f4be44e37e33\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:26Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:26 crc kubenswrapper[4816]: I0316 00:09:26.767123 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3a9d51fa-9450-4f18-be16-0a93d64889b4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94c8e0e7b0ecd4d0f525f55538bff0b01653544be87c923fc152d4d982da7021\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02c81b3d94044c1f4f344dd8306cacf8db533a662535ddb0a6a4ffc1bb4cf1f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37dea82cd38ef9cfbda3626b789b21af2ada4b4a58bc2ecc6ce55eda60052277\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e44b0bbe4cebe4e52726f3f654f8b4a2953abcd0ad32db752c4641b3f0be1b00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://e44b0bbe4cebe4e52726f3f654f8b4a2953abcd0ad32db752c4641b3f0be1b00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:26Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:26 crc kubenswrapper[4816]: I0316 00:09:26.786326 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:26Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:26 crc kubenswrapper[4816]: I0316 00:09:26.806985 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:26Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:26 crc kubenswrapper[4816]: I0316 00:09:26.825355 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jqsjn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84360ef9-0450-44c5-80eb-eab1bf8e808b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxldb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxldb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jqsjn\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:26Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:27 crc kubenswrapper[4816]: I0316 00:09:27.476025 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-psjs7_2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88/ovnkube-controller/3.log" Mar 16 00:09:27 crc kubenswrapper[4816]: I0316 00:09:27.481884 4816 scope.go:117] "RemoveContainer" containerID="ba7195b779774df29cc722244d9b4290e7f75033f6e135e41288fbb5310a7c3c" Mar 16 00:09:27 crc kubenswrapper[4816]: E0316 00:09:27.482180 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-psjs7_openshift-ovn-kubernetes(2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88)\"" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" podUID="2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" Mar 16 00:09:27 crc kubenswrapper[4816]: I0316 00:09:27.502153 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://516852108b3d17a15676573ff080d3d94fa48d8076ed0404a8d827e715c2555e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-16T00:09:27Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:27 crc kubenswrapper[4816]: I0316 00:09:27.521530 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:27Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:27 crc kubenswrapper[4816]: I0316 00:09:27.546442 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mt7bq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"03ef49f1-0c6a-443a-8df3-2db339c562ed\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d31459f93f54cc1cd99638b1f108b8997b34f246cf8c65e5462e672f04aaa7c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8979f751ec4797986924a3efea84d02bc1c1cf303e1e992f349c9083f2fe4f08\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8979f751ec4797986924a3efea84d02bc1c1cf303e1e992f349c9083f2fe4f08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa87691b0dc314209d88d1f4d5b2227abc1de527e8111831bb8243ba6192c300\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa87691b0dc314209d88d1f4d5b2227abc1de527e8111831bb8243ba6192c300\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33f82b993e3d729a5830e88136e74687c930d6ad1a15c64f003f311fda418bce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33f82b993e3d729a5830e88136e74687c930d6ad1a15c64f003f311fda418bce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b640c
26797535db11051d50aefd13b7b2759cfa19d163f53764b0fe8710cefdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b640c26797535db11051d50aefd13b7b2759cfa19d163f53764b0fe8710cefdc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe5359762a36b39c9f8ae939a1b9d094c42ffab59c4b9743a83b4391c171a512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe5359762a36b39c9f8ae939a1b9d094c42ffab59c4b9743a83b4391c171a512\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:36Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc31f057b776cf1925d265b64c7a9a9fb0993e4f3ffff530a48f28865e1c2954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc31f057b776cf1925d265b64c7a9a9fb0993e4f3ffff530a48f28865e1c2954\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mt7bq\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:27Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:27 crc kubenswrapper[4816]: I0316 00:09:27.563877 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lhpbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ec6a8ee-efd9-45df-bb35-706fcc90ebe9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a34b33696ef85102800de34359b2f8b4bf11c03ca6347dde766456bcea20e0e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-03-16T00:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c2x8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lhpbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:27Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:27 crc kubenswrapper[4816]: I0316 00:09:27.589446 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c8786d1-025d-4788-bafe-c2c2eaf8e398\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46db23088e8ba791d736f70a65bd066af80a5ec27c31332d9ac9c52530b21bfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da7b257af12b9ee605dc32a26d9565f866a9cafebb4936a0bde7f42ae9909f80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2523ea43f2490afb81354be8a2840f54a3be1a8d3154196e0c8ac365d09723a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e7f3b9b1fc1648908bbf44f9ad32e1eb510b68336a8b60c8417df2f622a36e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:59Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0316 00:07:59.373922 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0316 00:07:59.374075 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0316 00:07:59.374860 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2696044728/tls.crt::/tmp/serving-cert-2696044728/tls.key\\\\\\\"\\\\nI0316 00:07:59.560928 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0316 00:07:59.563996 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0316 00:07:59.564013 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0316 00:07:59.564035 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0316 00:07:59.564041 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0316 00:07:59.571475 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0316 00:07:59.571523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571531 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571540 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0316 00:07:59.571574 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0316 00:07:59.571588 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0316 00:07:59.571493 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0316 00:07:59.571597 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0316 00:07:59.573484 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81d0c40be9c3912864280cd98481f837ef29f69d85599b39decf5e9d7a97eff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e
85121f67007a774524167b1063cf1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:27Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:27 crc kubenswrapper[4816]: I0316 00:09:27.610781 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3a9d51fa-9450-4f18-be16-0a93d64889b4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94c8e0e7b0ecd4d0f525f55538bff0b01653544be87c923fc152d4d982da7021\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02c81b3d94044c1f4f344dd8306cacf8db533a662535ddb0a6a4ffc1bb4cf1f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37dea82cd38ef9cfbda3626b789b21af2ada4b4a58bc2ecc6ce55eda60052277\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e44b0bbe4cebe4e52726f3f654f8b4a2953abcd0ad32db752c4641b3f0be1b00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://e44b0bbe4cebe4e52726f3f654f8b4a2953abcd0ad32db752c4641b3f0be1b00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:27Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:27 crc kubenswrapper[4816]: I0316 00:09:27.629389 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:27Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:27 crc kubenswrapper[4816]: I0316 00:09:27.649465 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:27Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:27 crc kubenswrapper[4816]: I0316 00:09:27.668045 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jqsjn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84360ef9-0450-44c5-80eb-eab1bf8e808b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxldb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxldb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jqsjn\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:27Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:27 crc kubenswrapper[4816]: I0316 00:09:27.691674 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a695b85-75f5-42bb-a13c-aa20512355e4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6e7e97dbf041ed59e094f9be34956cb28d4774022777f2371a2eb752937f551\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fea427361067bc1a9d56b7b6699b072b5cdeee8345bf6618b8d81c6848f62098\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:20Z\\\",\\\"message\\\":\\\"+ timeout 
3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0316 00:06:49.819775 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0316 00:06:49.823008 1 observer_polling.go:159] Starting file observer\\\\nI0316 00:06:49.860579 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0316 00:06:49.866006 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0316 00:07:20.134248 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:19Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f3dfc8f46079b51e52802920a734bf796a00db2cf42501b9f7e202e0e9bc2ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5afc1a568801d941c814fe5e4eb91df609672a9431082d578d600f68f669aa3f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c60304164aa2f8e740ab25d779ef9b0aa66f6acca476bae45a13f4be44e37e33\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:27Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:27 crc kubenswrapper[4816]: I0316 00:09:27.713723 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-szscw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9789e58-12c8-4831-9401-af48a3e92209\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b45d6058e72f7117fdfb86b3480de77c8592dba7dcd7932b2a5c34036da7af26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6f0b7e8f95bbbf3b90133349b8b13d2784582a5d95dae8dd159328b570ae962\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-16T00:09:18Z\\\",\\\"message\\\":\\\"2026-03-16T00:08:33+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_c9e0d937-8428-483b-a1c0-f9ef411a7cbe\\\\n2026-03-16T00:08:33+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_c9e0d937-8428-483b-a1c0-f9ef411a7cbe to /host/opt/cni/bin/\\\\n2026-03-16T00:08:33Z [verbose] multus-daemon started\\\\n2026-03-16T00:08:33Z [verbose] 
Readiness Indicator file check\\\\n2026-03-16T00:09:18Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:31Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:09:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxf6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-szscw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:27Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:27 crc kubenswrapper[4816]: I0316 00:09:27.735592 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd08ece2-7636-4966-973a-e96a34b70b53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6dd22d6548986bf126a8ce6b30c7c11d8c37bd0c03539a55cc28fb7debcfc26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trwbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7003a4592a48f87ab63e9f5637c7665040f9784a
7c8f599c82ed61270ddb793c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trwbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jrdcz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:27Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:27 crc kubenswrapper[4816]: I0316 00:09:27.768159 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f47b6c54d5d72d827208f4d8228be260d9d19b4a720b3174349288dee2916641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e0a510814d60106574e3cd3e612c031ac900caa0f24a12a62d7cd1f6bfda1ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://826495d8ab8f414169bf36159278da0a040fec699a74b4aebdc8f87ccdc48bd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa5f052d32ad8f52f33ee5baea5af98a64f05b831b86ebced6c9422fb4845fcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86c28f1ae2f2fb61b7755ac14bc7f5346036735c75daccaa0e31b4e606cfb4c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fac0a986edf984b0e34eda0f969205d5bc92f23d0edacb3c6dd8721eaed2fb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba7195b779774df29cc722244d9b4290e7f75033f6e135e41288fbb5310a7c3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba7195b779774df29cc722244d9b4290e7f75033f6e135e41288fbb5310a7c3c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-16T00:09:25Z\\\",\\\"message\\\":\\\":[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {960d98b2-dc64-4e93-a4b6-9b19847af71e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0316 00:09:25.816315 7422 services_controller.go:443] Built service openshift-etcd-operator/metrics LB cluster-wide 
configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.188\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:443, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI0316 00:09:25.816331 7422 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kbgw5\\\\nI0316 00:09:25.816342 7422 services_controller.go:444] Built service openshift-etcd-operator/metrics LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI0316 00:09:25.815541 7422 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-target-xd92c\\\\nI0316 00:09:25.815313 7422 services_controller.go:443] Built service openshift-machine-config-operator/machine-config-controller LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.16\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:9001, clus\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:09:24Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-psjs7_openshift-ovn-kubernetes(2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d6bf581280690c99ddbbecd4cf4e215a12230d377978aa48acd3c05b85a3c56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1249f8b1c01db98230c10bde49bdb68f17caccc1ba0546e462df4384ca1658dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1249f8b1c01db98230
c10bde49bdb68f17caccc1ba0546e462df4384ca1658dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-psjs7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:27Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:27 crc kubenswrapper[4816]: E0316 00:09:27.795982 4816 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 16 00:09:27 crc kubenswrapper[4816]: I0316 00:09:27.807849 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49e35ef6-7a6d-438f-b598-4445d73db379\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbe90b49c5d42bb9d6d5d1b8f1d02b571403dd3f39d3118081351d72c4910914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\
\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://508e6dbe6f2f1392d16501e09ee2ec523a3c2a245cd96c6ab773e26e83b8bb6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b5a3223ff0fb5b96e5b87ee836add66498165759f4d6ed8cb6413f091967e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://603bdcbd4a6af5fda7ca09e4823e0db8d7ec3e3eb38eddf9f062a14b433cb094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee
1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b061f16a65a27b9a5ef66231376b70fec084479a991b7fdd430957df76da32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"start
ed\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:27Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:27 crc kubenswrapper[4816]: I0316 00:09:27.830785 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://723654bc91ce4d40a27b14a9c4df1ed6c8dff2517f378927d7a6a96aeaa6951b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:27Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:27 crc kubenswrapper[4816]: I0316 00:09:27.850368 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37869baff974556c22aadadfa4b528b5d6b9ad236107b4dbc945b3cdbf743f20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://2f9fe727708fdaca68596b7d305629ade45b061a88a81085f885d490ab99e24b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:27Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:27 crc kubenswrapper[4816]: I0316 00:09:27.865219 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cnhkf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e686cd4-bddf-463e-b471-e49ea862691e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f6c3cacd4d7f6866e66b7234b280540ef2c443531de7a11f6894323ef24a9fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bwzt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cnhkf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:27Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:27 crc kubenswrapper[4816]: I0316 00:09:27.880527 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kbgw5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b28986d-e33b-4876-ab6d-64d69960fb8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://544bab5013b513f755dd9fa88cbc71021b0cfe823aa7508af994c0b1342c3646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7
93426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zgr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0beb1aba8b8813abd98809200fa4dd348377427596c2935ceb012361634df2b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zgr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kbgw5\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:27Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:27 crc kubenswrapper[4816]: I0316 00:09:27.896471 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"580d9565-d55c-4108-af2d-07ab6fa7ea2e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2a41f624c9d40a637cf958648b795587652d7139d4b7cea2e8c70d1cdf05dd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPat
h\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59fcd3602ca5395b4f9b934768cad48d5504480af46a1fff8203eb46415b8769\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59fcd3602ca5395b4f9b934768cad48d5504480af46a1fff8203eb46415b8769\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:27Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:27 crc kubenswrapper[4816]: I0316 00:09:27.917505 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://723654bc91ce4d40a27b14a9c4df1ed6c8dff2517f378927d7a6a96aeaa6951b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:27Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:27 crc kubenswrapper[4816]: I0316 00:09:27.938637 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37869baff974556c22aadadfa4b528b5d6b9ad236107b4dbc945b3cdbf743f20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://2f9fe727708fdaca68596b7d305629ade45b061a88a81085f885d490ab99e24b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:27Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:27 crc kubenswrapper[4816]: I0316 00:09:27.957390 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cnhkf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e686cd4-bddf-463e-b471-e49ea862691e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f6c3cacd4d7f6866e66b7234b280540ef2c443531de7a11f6894323ef24a9fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bwzt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cnhkf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:27Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:27 crc kubenswrapper[4816]: I0316 00:09:27.976835 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kbgw5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b28986d-e33b-4876-ab6d-64d69960fb8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://544bab5013b513f755dd9fa88cbc71021b0cfe823aa7508af994c0b1342c3646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7
93426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zgr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0beb1aba8b8813abd98809200fa4dd348377427596c2935ceb012361634df2b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zgr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kbgw5\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:27Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:27 crc kubenswrapper[4816]: I0316 00:09:27.994295 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"580d9565-d55c-4108-af2d-07ab6fa7ea2e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2a41f624c9d40a637cf958648b795587652d7139d4b7cea2e8c70d1cdf05dd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPat
h\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59fcd3602ca5395b4f9b934768cad48d5504480af46a1fff8203eb46415b8769\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59fcd3602ca5395b4f9b934768cad48d5504480af46a1fff8203eb46415b8769\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:27Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:28 crc kubenswrapper[4816]: I0316 00:09:28.014987 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://516852108b3d17a15676573ff080d3d94fa48d8076ed0404a8d827e715c2555e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-16T00:09:28Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:28 crc kubenswrapper[4816]: I0316 00:09:28.033641 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:28Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:28 crc kubenswrapper[4816]: I0316 00:09:28.055228 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mt7bq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"03ef49f1-0c6a-443a-8df3-2db339c562ed\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d31459f93f54cc1cd99638b1f108b8997b34f246cf8c65e5462e672f04aaa7c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8979f751ec4797986924a3efea84d02bc1c1cf303e1e992f349c9083f2fe4f08\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8979f751ec4797986924a3efea84d02bc1c1cf303e1e992f349c9083f2fe4f08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa87691b0dc314209d88d1f4d5b2227abc1de527e8111831bb8243ba6192c300\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa87691b0dc314209d88d1f4d5b2227abc1de527e8111831bb8243ba6192c300\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33f82b993e3d729a5830e88136e74687c930d6ad1a15c64f003f311fda418bce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33f82b993e3d729a5830e88136e74687c930d6ad1a15c64f003f311fda418bce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b640c
26797535db11051d50aefd13b7b2759cfa19d163f53764b0fe8710cefdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b640c26797535db11051d50aefd13b7b2759cfa19d163f53764b0fe8710cefdc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe5359762a36b39c9f8ae939a1b9d094c42ffab59c4b9743a83b4391c171a512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe5359762a36b39c9f8ae939a1b9d094c42ffab59c4b9743a83b4391c171a512\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:36Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc31f057b776cf1925d265b64c7a9a9fb0993e4f3ffff530a48f28865e1c2954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc31f057b776cf1925d265b64c7a9a9fb0993e4f3ffff530a48f28865e1c2954\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mt7bq\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:28Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:28 crc kubenswrapper[4816]: I0316 00:09:28.069758 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lhpbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ec6a8ee-efd9-45df-bb35-706fcc90ebe9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a34b33696ef85102800de34359b2f8b4bf11c03ca6347dde766456bcea20e0e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-03-16T00:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c2x8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lhpbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:28Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:28 crc kubenswrapper[4816]: I0316 00:09:28.087888 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c8786d1-025d-4788-bafe-c2c2eaf8e398\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46db23088e8ba791d736f70a65bd066af80a5ec27c31332d9ac9c52530b21bfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da7b257af12b9ee605dc32a26d9565f866a9cafebb4936a0bde7f42ae9909f80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2523ea43f2490afb81354be8a2840f54a3be1a8d3154196e0c8ac365d09723a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e7f3b9b1fc1648908bbf44f9ad32e1eb510b68336a8b60c8417df2f622a36e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:59Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0316 00:07:59.373922 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0316 00:07:59.374075 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0316 00:07:59.374860 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2696044728/tls.crt::/tmp/serving-cert-2696044728/tls.key\\\\\\\"\\\\nI0316 00:07:59.560928 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0316 00:07:59.563996 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0316 00:07:59.564013 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0316 00:07:59.564035 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0316 00:07:59.564041 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0316 00:07:59.571475 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0316 00:07:59.571523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571531 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571540 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0316 00:07:59.571574 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0316 00:07:59.571588 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0316 00:07:59.571493 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0316 00:07:59.571597 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0316 00:07:59.573484 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81d0c40be9c3912864280cd98481f837ef29f69d85599b39decf5e9d7a97eff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e
85121f67007a774524167b1063cf1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:28Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:28 crc kubenswrapper[4816]: I0316 00:09:28.102652 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3a9d51fa-9450-4f18-be16-0a93d64889b4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94c8e0e7b0ecd4d0f525f55538bff0b01653544be87c923fc152d4d982da7021\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02c81b3d94044c1f4f344dd8306cacf8db533a662535ddb0a6a4ffc1bb4cf1f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37dea82cd38ef9cfbda3626b789b21af2ada4b4a58bc2ecc6ce55eda60052277\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e44b0bbe4cebe4e52726f3f654f8b4a2953abcd0ad32db752c4641b3f0be1b00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://e44b0bbe4cebe4e52726f3f654f8b4a2953abcd0ad32db752c4641b3f0be1b00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:28Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:28 crc kubenswrapper[4816]: I0316 00:09:28.115263 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:28Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:28 crc kubenswrapper[4816]: I0316 00:09:28.126334 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:28Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:28 crc kubenswrapper[4816]: I0316 00:09:28.136790 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jqsjn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84360ef9-0450-44c5-80eb-eab1bf8e808b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxldb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxldb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jqsjn\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:28Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:28 crc kubenswrapper[4816]: I0316 00:09:28.147787 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a695b85-75f5-42bb-a13c-aa20512355e4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6e7e97dbf041ed59e094f9be34956cb28d4774022777f2371a2eb752937f551\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fea427361067bc1a9d56b7b6699b072b5cdeee8345bf6618b8d81c6848f62098\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:20Z\\\",\\\"message\\\":\\\"+ timeout 
3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0316 00:06:49.819775 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0316 00:06:49.823008 1 observer_polling.go:159] Starting file observer\\\\nI0316 00:06:49.860579 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0316 00:06:49.866006 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0316 00:07:20.134248 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:19Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f3dfc8f46079b51e52802920a734bf796a00db2cf42501b9f7e202e0e9bc2ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5afc1a568801d941c814fe5e4eb91df609672a9431082d578d600f68f669aa3f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c60304164aa2f8e740ab25d779ef9b0aa66f6acca476bae45a13f4be44e37e33\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:28Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:28 crc kubenswrapper[4816]: I0316 00:09:28.164158 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-szscw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9789e58-12c8-4831-9401-af48a3e92209\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b45d6058e72f7117fdfb86b3480de77c8592dba7dcd7932b2a5c34036da7af26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6f0b7e8f95bbbf3b90133349b8b13d2784582a5d95dae8dd159328b570ae962\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-16T00:09:18Z\\\",\\\"message\\\":\\\"2026-03-16T00:08:33+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_c9e0d937-8428-483b-a1c0-f9ef411a7cbe\\\\n2026-03-16T00:08:33+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_c9e0d937-8428-483b-a1c0-f9ef411a7cbe to /host/opt/cni/bin/\\\\n2026-03-16T00:08:33Z [verbose] multus-daemon started\\\\n2026-03-16T00:08:33Z [verbose] 
Readiness Indicator file check\\\\n2026-03-16T00:09:18Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:31Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:09:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxf6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-szscw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:28Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:28 crc kubenswrapper[4816]: I0316 00:09:28.176422 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd08ece2-7636-4966-973a-e96a34b70b53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6dd22d6548986bf126a8ce6b30c7c11d8c37bd0c03539a55cc28fb7debcfc26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trwbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7003a4592a48f87ab63e9f5637c7665040f9784a
7c8f599c82ed61270ddb793c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trwbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jrdcz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:28Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:28 crc kubenswrapper[4816]: I0316 00:09:28.206058 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f47b6c54d5d72d827208f4d8228be260d9d19b4a720b3174349288dee2916641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e0a510814d60106574e3cd3e612c031ac900caa0f24a12a62d7cd1f6bfda1ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://826495d8ab8f414169bf36159278da0a040fec699a74b4aebdc8f87ccdc48bd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa5f052d32ad8f52f33ee5baea5af98a64f05b831b86ebced6c9422fb4845fcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86c28f1ae2f2fb61b7755ac14bc7f5346036735c75daccaa0e31b4e606cfb4c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fac0a986edf984b0e34eda0f969205d5bc92f23d0edacb3c6dd8721eaed2fb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba7195b779774df29cc722244d9b4290e7f75033f6e135e41288fbb5310a7c3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba7195b779774df29cc722244d9b4290e7f75033f6e135e41288fbb5310a7c3c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-16T00:09:25Z\\\",\\\"message\\\":\\\":[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {960d98b2-dc64-4e93-a4b6-9b19847af71e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0316 00:09:25.816315 7422 services_controller.go:443] Built service openshift-etcd-operator/metrics LB cluster-wide 
configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.188\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:443, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI0316 00:09:25.816331 7422 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kbgw5\\\\nI0316 00:09:25.816342 7422 services_controller.go:444] Built service openshift-etcd-operator/metrics LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI0316 00:09:25.815541 7422 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-target-xd92c\\\\nI0316 00:09:25.815313 7422 services_controller.go:443] Built service openshift-machine-config-operator/machine-config-controller LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.16\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:9001, clus\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:09:24Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-psjs7_openshift-ovn-kubernetes(2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d6bf581280690c99ddbbecd4cf4e215a12230d377978aa48acd3c05b85a3c56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1249f8b1c01db98230c10bde49bdb68f17caccc1ba0546e462df4384ca1658dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1249f8b1c01db98230
c10bde49bdb68f17caccc1ba0546e462df4384ca1658dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-psjs7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:28Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:28 crc kubenswrapper[4816]: I0316 00:09:28.225108 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"49e35ef6-7a6d-438f-b598-4445d73db379\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbe90b49c5d42bb9d6d5d1b8f1d02b571403dd3f39d3118081351d72c4910914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://508e6dbe6f2f1392d16501e09ee2ec523a3c2a245cd96c6ab773e26e83b8bb6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b5a3223ff0fb5b96e5b87ee836add66498165759f4d6ed8cb6413f091967e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://603bdcbd4a6af5fda7ca09e4823e0db8d7ec3e3eb38eddf9f062a14b433cb094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b061f16a65a27b9a5ef66231376b70fec084479a991b7fdd430957df76da32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:28Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:28 crc kubenswrapper[4816]: I0316 00:09:28.667228 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jqsjn" Mar 16 00:09:28 crc kubenswrapper[4816]: I0316 00:09:28.667252 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:09:28 crc kubenswrapper[4816]: I0316 00:09:28.667321 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:09:28 crc kubenswrapper[4816]: I0316 00:09:28.667402 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:09:28 crc kubenswrapper[4816]: E0316 00:09:28.667951 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jqsjn" podUID="84360ef9-0450-44c5-80eb-eab1bf8e808b" Mar 16 00:09:28 crc kubenswrapper[4816]: E0316 00:09:28.668271 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:09:28 crc kubenswrapper[4816]: E0316 00:09:28.668433 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:09:28 crc kubenswrapper[4816]: E0316 00:09:28.668537 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:09:30 crc kubenswrapper[4816]: I0316 00:09:30.626334 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:09:30 crc kubenswrapper[4816]: I0316 00:09:30.626393 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:09:30 crc kubenswrapper[4816]: I0316 00:09:30.626418 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:09:30 crc kubenswrapper[4816]: I0316 00:09:30.626446 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:09:30 crc kubenswrapper[4816]: I0316 00:09:30.626467 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:09:30Z","lastTransitionTime":"2026-03-16T00:09:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:09:30 crc kubenswrapper[4816]: E0316 00:09:30.648757 4816 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d8fb348c-4907-4c8f-859a-735976530e03\\\",\\\"systemUUID\\\":\\\"e97abf98-298f-4589-a70a-4cfb5cb2994a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:30Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:30 crc kubenswrapper[4816]: I0316 00:09:30.655530 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:09:30 crc kubenswrapper[4816]: I0316 00:09:30.655818 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:09:30 crc kubenswrapper[4816]: I0316 00:09:30.655994 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:09:30 crc kubenswrapper[4816]: I0316 00:09:30.656158 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:09:30 crc kubenswrapper[4816]: I0316 00:09:30.656302 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:09:30Z","lastTransitionTime":"2026-03-16T00:09:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:09:30 crc kubenswrapper[4816]: I0316 00:09:30.667299 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:09:30 crc kubenswrapper[4816]: I0316 00:09:30.667416 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:09:30 crc kubenswrapper[4816]: I0316 00:09:30.667374 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:09:30 crc kubenswrapper[4816]: E0316 00:09:30.667528 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:09:30 crc kubenswrapper[4816]: I0316 00:09:30.667321 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jqsjn" Mar 16 00:09:30 crc kubenswrapper[4816]: E0316 00:09:30.667898 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:09:30 crc kubenswrapper[4816]: E0316 00:09:30.668293 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:09:30 crc kubenswrapper[4816]: E0316 00:09:30.668623 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jqsjn" podUID="84360ef9-0450-44c5-80eb-eab1bf8e808b" Mar 16 00:09:30 crc kubenswrapper[4816]: E0316 00:09:30.676786 4816 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d8fb348c-4907-4c8f-859a-735976530e03\\\",\\\"systemUUID\\\":\\\"e97abf98-298f-4589-a70a-4cfb5cb2994a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:30Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:30 crc kubenswrapper[4816]: I0316 00:09:30.681948 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:09:30 crc kubenswrapper[4816]: I0316 00:09:30.682004 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:09:30 crc kubenswrapper[4816]: I0316 00:09:30.682031 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:09:30 crc kubenswrapper[4816]: I0316 00:09:30.682060 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:09:30 crc kubenswrapper[4816]: I0316 00:09:30.682081 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:09:30Z","lastTransitionTime":"2026-03-16T00:09:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:09:30 crc kubenswrapper[4816]: E0316 00:09:30.702992 4816 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d8fb348c-4907-4c8f-859a-735976530e03\\\",\\\"systemUUID\\\":\\\"e97abf98-298f-4589-a70a-4cfb5cb2994a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:30Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:30 crc kubenswrapper[4816]: I0316 00:09:30.708664 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:09:30 crc kubenswrapper[4816]: I0316 00:09:30.708719 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:09:30 crc kubenswrapper[4816]: I0316 00:09:30.708735 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:09:30 crc kubenswrapper[4816]: I0316 00:09:30.708763 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:09:30 crc kubenswrapper[4816]: I0316 00:09:30.708780 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:09:30Z","lastTransitionTime":"2026-03-16T00:09:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:09:30 crc kubenswrapper[4816]: E0316 00:09:30.729509 4816 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d8fb348c-4907-4c8f-859a-735976530e03\\\",\\\"systemUUID\\\":\\\"e97abf98-298f-4589-a70a-4cfb5cb2994a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:30Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:30 crc kubenswrapper[4816]: I0316 00:09:30.735198 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:09:30 crc kubenswrapper[4816]: I0316 00:09:30.735261 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:09:30 crc kubenswrapper[4816]: I0316 00:09:30.735285 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:09:30 crc kubenswrapper[4816]: I0316 00:09:30.735309 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:09:30 crc kubenswrapper[4816]: I0316 00:09:30.735326 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:09:30Z","lastTransitionTime":"2026-03-16T00:09:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:09:30 crc kubenswrapper[4816]: E0316 00:09:30.754140 4816 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d8fb348c-4907-4c8f-859a-735976530e03\\\",\\\"systemUUID\\\":\\\"e97abf98-298f-4589-a70a-4cfb5cb2994a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:30Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:30 crc kubenswrapper[4816]: E0316 00:09:30.754372 4816 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 16 00:09:32 crc kubenswrapper[4816]: I0316 00:09:32.667653 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:09:32 crc kubenswrapper[4816]: I0316 00:09:32.667692 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:09:32 crc kubenswrapper[4816]: I0316 00:09:32.667760 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:09:32 crc kubenswrapper[4816]: I0316 00:09:32.667891 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jqsjn" Mar 16 00:09:32 crc kubenswrapper[4816]: E0316 00:09:32.667887 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:09:32 crc kubenswrapper[4816]: E0316 00:09:32.667987 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:09:32 crc kubenswrapper[4816]: E0316 00:09:32.668202 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jqsjn" podUID="84360ef9-0450-44c5-80eb-eab1bf8e808b" Mar 16 00:09:32 crc kubenswrapper[4816]: E0316 00:09:32.668318 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:09:32 crc kubenswrapper[4816]: E0316 00:09:32.797288 4816 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 16 00:09:34 crc kubenswrapper[4816]: I0316 00:09:34.667285 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:09:34 crc kubenswrapper[4816]: E0316 00:09:34.667692 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:09:34 crc kubenswrapper[4816]: I0316 00:09:34.667429 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:09:34 crc kubenswrapper[4816]: I0316 00:09:34.667382 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:09:34 crc kubenswrapper[4816]: E0316 00:09:34.667775 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:09:34 crc kubenswrapper[4816]: I0316 00:09:34.667432 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jqsjn" Mar 16 00:09:34 crc kubenswrapper[4816]: E0316 00:09:34.667996 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:09:34 crc kubenswrapper[4816]: E0316 00:09:34.668173 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jqsjn" podUID="84360ef9-0450-44c5-80eb-eab1bf8e808b" Mar 16 00:09:36 crc kubenswrapper[4816]: I0316 00:09:36.667202 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:09:36 crc kubenswrapper[4816]: I0316 00:09:36.667290 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jqsjn" Mar 16 00:09:36 crc kubenswrapper[4816]: E0316 00:09:36.667372 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:09:36 crc kubenswrapper[4816]: E0316 00:09:36.667498 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jqsjn" podUID="84360ef9-0450-44c5-80eb-eab1bf8e808b" Mar 16 00:09:36 crc kubenswrapper[4816]: I0316 00:09:36.667616 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:09:36 crc kubenswrapper[4816]: E0316 00:09:36.667703 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:09:36 crc kubenswrapper[4816]: I0316 00:09:36.667757 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:09:36 crc kubenswrapper[4816]: E0316 00:09:36.667837 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:09:37 crc kubenswrapper[4816]: I0316 00:09:37.690734 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c8786d1-025d-4788-bafe-c2c2eaf8e398\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46db23088e8ba791d736f70a65bd066af80a5ec27c31332d9ac9c52530b21bfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-
dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da7b257af12b9ee605dc32a26d9565f866a9cafebb4936a0bde7f42ae9909f80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2523ea43f2490afb81354be8a2840f54a3be1a8d3154196e0c8ac365d09723a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e7f3b9b1fc1648908bbf44f9ad32e1eb510b68336a8b60c8417df2f622a36e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/opensh
ift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0316 00:07:59.373922 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0316 00:07:59.374075 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0316 00:07:59.374860 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2696044728/tls.crt::/tmp/serving-cert-2696044728/tls.key\\\\\\\"\\\\nI0316 00:07:59.560928 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0316 00:07:59.563996 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0316 00:07:59.564013 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0316 00:07:59.564035 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0316 00:07:59.564041 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0316 00:07:59.571475 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0316 00:07:59.571523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571531 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571540 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0316 00:07:59.571574 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' 
detected.\\\\nW0316 00:07:59.571588 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0316 00:07:59.571493 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0316 00:07:59.571597 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0316 00:07:59.573484 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81d0c40be9c3912864280cd98481f837ef29f69d85599b39decf5e9d7a97eff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/open
shift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:37Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:37 crc kubenswrapper[4816]: I0316 00:09:37.712804 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://516852108b3d17a15676573ff080d3d94fa48d8076ed0404a8d827e715c2555e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-16T00:09:37Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:37 crc kubenswrapper[4816]: I0316 00:09:37.728847 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:37Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:37 crc kubenswrapper[4816]: I0316 00:09:37.746883 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mt7bq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"03ef49f1-0c6a-443a-8df3-2db339c562ed\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d31459f93f54cc1cd99638b1f108b8997b34f246cf8c65e5462e672f04aaa7c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8979f751ec4797986924a3efea84d02bc1c1cf303e1e992f349c9083f2fe4f08\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8979f751ec4797986924a3efea84d02bc1c1cf303e1e992f349c9083f2fe4f08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa87691b0dc314209d88d1f4d5b2227abc1de527e8111831bb8243ba6192c300\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa87691b0dc314209d88d1f4d5b2227abc1de527e8111831bb8243ba6192c300\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33f82b993e3d729a5830e88136e74687c930d6ad1a15c64f003f311fda418bce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33f82b993e3d729a5830e88136e74687c930d6ad1a15c64f003f311fda418bce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b640c
26797535db11051d50aefd13b7b2759cfa19d163f53764b0fe8710cefdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b640c26797535db11051d50aefd13b7b2759cfa19d163f53764b0fe8710cefdc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe5359762a36b39c9f8ae939a1b9d094c42ffab59c4b9743a83b4391c171a512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe5359762a36b39c9f8ae939a1b9d094c42ffab59c4b9743a83b4391c171a512\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:36Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc31f057b776cf1925d265b64c7a9a9fb0993e4f3ffff530a48f28865e1c2954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc31f057b776cf1925d265b64c7a9a9fb0993e4f3ffff530a48f28865e1c2954\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mt7bq\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:37Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:37 crc kubenswrapper[4816]: I0316 00:09:37.758282 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lhpbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ec6a8ee-efd9-45df-bb35-706fcc90ebe9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a34b33696ef85102800de34359b2f8b4bf11c03ca6347dde766456bcea20e0e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-03-16T00:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c2x8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lhpbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:37Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:37 crc kubenswrapper[4816]: I0316 00:09:37.775269 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a695b85-75f5-42bb-a13c-aa20512355e4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6e7e97dbf041ed59e094f9be34956cb28d4774022777f2371a2eb752937f551\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fea427361067bc1a9d56b7b6699b072b5cdeee8345bf6618b8d81c6848f62098\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:20Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0316 00:06:49.819775 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0316 00:06:49.823008 1 observer_polling.go:159] Starting file observer\\\\nI0316 00:06:49.860579 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0316 00:06:49.866006 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0316 00:07:20.134248 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:19Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f3dfc8f46079b51e52802920a734bf796a00db2cf42501b9f7e202e0e9bc2ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5afc1a568801d941c814fe5e4eb91df609672a9431082d578d600f68f669aa3f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c60304164aa2f8e740ab25d779ef9b0aa66f6acca476bae45a13f4be44e37e33\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:37Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:37 crc kubenswrapper[4816]: I0316 00:09:37.793112 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3a9d51fa-9450-4f18-be16-0a93d64889b4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94c8e0e7b0ecd4d0f525f55538bff0b01653544be87c923fc152d4d982da7021\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02c81b3d94044c1f4f344dd8306cacf8db533a662535ddb0a6a4ffc1bb4cf1f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37dea82cd38ef9cfbda3626b789b21af2ada4b4a58bc2ecc6ce55eda60052277\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e44b0bbe4cebe4e52726f3f654f8b4a2953abcd0ad32db752c4641b3f0be1b00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://e44b0bbe4cebe4e52726f3f654f8b4a2953abcd0ad32db752c4641b3f0be1b00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:37Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:37 crc kubenswrapper[4816]: E0316 00:09:37.798366 4816 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 16 00:09:37 crc kubenswrapper[4816]: I0316 00:09:37.811430 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:37Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:37 crc kubenswrapper[4816]: I0316 00:09:37.827123 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:37Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:37 crc kubenswrapper[4816]: I0316 00:09:37.840841 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jqsjn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84360ef9-0450-44c5-80eb-eab1bf8e808b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxldb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxldb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jqsjn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:37Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:37 crc 
kubenswrapper[4816]: I0316 00:09:37.859307 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49e35ef6-7a6d-438f-b598-4445d73db379\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbe90b49c5d42bb9d6d5d1b8f1d02b571403dd3f39d3118081351d72c4910914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://508e6dbe6f2f1392d16501e09ee2ec523a3c2a245cd96c6ab773e26e83b8bb6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b5a3223ff0fb5b96e5b87ee836add66498165759f4d6ed8cb6413f091967e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://603bdcbd4a6af5fda7ca09e4823e0db8d7ec3e3eb38eddf9f062a14b433cb094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b061f16a65a27b9a5ef66231376b70fec084479a991b7fdd430957df76da32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:37Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:37 crc kubenswrapper[4816]: I0316 00:09:37.879842 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-szscw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9789e58-12c8-4831-9401-af48a3e92209\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b45d6058e72f7117fdfb86b
3480de77c8592dba7dcd7932b2a5c34036da7af26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6f0b7e8f95bbbf3b90133349b8b13d2784582a5d95dae8dd159328b570ae962\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-16T00:09:18Z\\\",\\\"message\\\":\\\"2026-03-16T00:08:33+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_c9e0d937-8428-483b-a1c0-f9ef411a7cbe\\\\n2026-03-16T00:08:33+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_c9e0d937-8428-483b-a1c0-f9ef411a7cbe to /host/opt/cni/bin/\\\\n2026-03-16T00:08:33Z [verbose] multus-daemon started\\\\n2026-03-16T00:08:33Z [verbose] Readiness Indicator file check\\\\n2026-03-16T00:09:18Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:31Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:09:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxf6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-szscw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:37Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:37 crc kubenswrapper[4816]: I0316 00:09:37.898037 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd08ece2-7636-4966-973a-e96a34b70b53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6dd22d6548986bf126a8ce6b30c7c11d8c37bd0c03539a55cc28fb7debcfc26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09
\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trwbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7003a4592a48f87ab63e9f5637c7665040f9784a7c8f599c82ed61270ddb793c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trwbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jrdcz\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:37Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:37 crc kubenswrapper[4816]: I0316 00:09:37.934099 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f47b6c54d5d72d827208f4d8228be260d9d19b4a720b3174349288dee2916641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e0a510814d60106574e3cd3e612c031ac900caa0f24a12a62d7cd1f6bfda1ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://826495d8ab8f414169bf36159278da0a040fec699a74b4aebdc8f87ccdc48bd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa5f052d32ad8f52f33ee5baea5af98a64f05b831b86ebced6c9422fb4845fcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86c28f1ae2f2fb61b7755ac14bc7f5346036735c75daccaa0e31b4e606cfb4c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fac0a986edf984b0e34eda0f969205d5bc92f23d0edacb3c6dd8721eaed2fb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba7195b779774df29cc722244d9b4290e7f75033f6e135e41288fbb5310a7c3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba7195b779774df29cc722244d9b4290e7f75033f6e135e41288fbb5310a7c3c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-16T00:09:25Z\\\",\\\"message\\\":\\\":[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {960d98b2-dc64-4e93-a4b6-9b19847af71e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0316 00:09:25.816315 7422 services_controller.go:443] Built service openshift-etcd-operator/metrics LB cluster-wide 
configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.188\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:443, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI0316 00:09:25.816331 7422 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kbgw5\\\\nI0316 00:09:25.816342 7422 services_controller.go:444] Built service openshift-etcd-operator/metrics LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI0316 00:09:25.815541 7422 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-target-xd92c\\\\nI0316 00:09:25.815313 7422 services_controller.go:443] Built service openshift-machine-config-operator/machine-config-controller LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.16\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:9001, clus\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:09:24Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-psjs7_openshift-ovn-kubernetes(2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d6bf581280690c99ddbbecd4cf4e215a12230d377978aa48acd3c05b85a3c56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1249f8b1c01db98230c10bde49bdb68f17caccc1ba0546e462df4384ca1658dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1249f8b1c01db98230
c10bde49bdb68f17caccc1ba0546e462df4384ca1658dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-psjs7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:37Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:37 crc kubenswrapper[4816]: I0316 00:09:37.950234 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"580d9565-d55c-4108-af2d-07ab6fa7ea2e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2a41f624c9d40a637cf958648b795587652d7139d4b7cea2e8c70d1cdf05dd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59fcd3602ca5395b4f9b934768cad48d5504480af46a1fff8203eb46415b8769\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59fcd3602ca5395b4f9b934768cad48d5504480af46a1fff8203eb46415b8769\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:37Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:37 crc kubenswrapper[4816]: I0316 00:09:37.971616 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://723654bc91ce4d40a27b14a9c4df1ed6c8dff2517f378927d7a6a96aeaa6951b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:37Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:37 crc kubenswrapper[4816]: I0316 00:09:37.995724 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37869baff974556c22aadadfa4b528b5d6b9ad236107b4dbc945b3cdbf743f20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://2f9fe727708fdaca68596b7d305629ade45b061a88a81085f885d490ab99e24b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:37Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:38 crc kubenswrapper[4816]: I0316 00:09:38.011905 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cnhkf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e686cd4-bddf-463e-b471-e49ea862691e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f6c3cacd4d7f6866e66b7234b280540ef2c443531de7a11f6894323ef24a9fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bwzt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cnhkf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:38Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:38 crc kubenswrapper[4816]: I0316 00:09:38.027754 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kbgw5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b28986d-e33b-4876-ab6d-64d69960fb8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://544bab5013b513f755dd9fa88cbc71021b0cfe823aa7508af994c0b1342c3646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7
93426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zgr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0beb1aba8b8813abd98809200fa4dd348377427596c2935ceb012361634df2b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zgr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kbgw5\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:38Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:38 crc kubenswrapper[4816]: I0316 00:09:38.667717 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:09:38 crc kubenswrapper[4816]: I0316 00:09:38.667785 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:09:38 crc kubenswrapper[4816]: I0316 00:09:38.667809 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:09:38 crc kubenswrapper[4816]: I0316 00:09:38.667717 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jqsjn" Mar 16 00:09:38 crc kubenswrapper[4816]: E0316 00:09:38.668006 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:09:38 crc kubenswrapper[4816]: E0316 00:09:38.668144 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jqsjn" podUID="84360ef9-0450-44c5-80eb-eab1bf8e808b" Mar 16 00:09:38 crc kubenswrapper[4816]: E0316 00:09:38.668290 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:09:38 crc kubenswrapper[4816]: E0316 00:09:38.668405 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:09:39 crc kubenswrapper[4816]: I0316 00:09:39.668355 4816 scope.go:117] "RemoveContainer" containerID="ba7195b779774df29cc722244d9b4290e7f75033f6e135e41288fbb5310a7c3c" Mar 16 00:09:39 crc kubenswrapper[4816]: E0316 00:09:39.668542 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-psjs7_openshift-ovn-kubernetes(2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88)\"" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" podUID="2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" Mar 16 00:09:40 crc kubenswrapper[4816]: I0316 00:09:40.667056 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:09:40 crc kubenswrapper[4816]: I0316 00:09:40.667181 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jqsjn" Mar 16 00:09:40 crc kubenswrapper[4816]: E0316 00:09:40.667292 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:09:40 crc kubenswrapper[4816]: I0316 00:09:40.667346 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:09:40 crc kubenswrapper[4816]: E0316 00:09:40.667433 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jqsjn" podUID="84360ef9-0450-44c5-80eb-eab1bf8e808b" Mar 16 00:09:40 crc kubenswrapper[4816]: I0316 00:09:40.667785 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:09:40 crc kubenswrapper[4816]: E0316 00:09:40.667738 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:09:40 crc kubenswrapper[4816]: E0316 00:09:40.667979 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:09:41 crc kubenswrapper[4816]: I0316 00:09:41.129203 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:09:41 crc kubenswrapper[4816]: I0316 00:09:41.129292 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:09:41 crc kubenswrapper[4816]: I0316 00:09:41.129311 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:09:41 crc kubenswrapper[4816]: I0316 00:09:41.129385 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:09:41 crc kubenswrapper[4816]: I0316 00:09:41.129405 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:09:41Z","lastTransitionTime":"2026-03-16T00:09:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:09:41 crc kubenswrapper[4816]: E0316 00:09:41.151107 4816 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d8fb348c-4907-4c8f-859a-735976530e03\\\",\\\"systemUUID\\\":\\\"e97abf98-298f-4589-a70a-4cfb5cb2994a\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:41Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:41 crc kubenswrapper[4816]: I0316 00:09:41.156970 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:09:41 crc kubenswrapper[4816]: I0316 00:09:41.157029 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:09:41 crc kubenswrapper[4816]: I0316 00:09:41.157048 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:09:41 crc kubenswrapper[4816]: I0316 00:09:41.157122 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:09:41 crc kubenswrapper[4816]: I0316 00:09:41.157143 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:09:41Z","lastTransitionTime":"2026-03-16T00:09:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:09:41 crc kubenswrapper[4816]: E0316 00:09:41.175799 4816 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d8fb348c-4907-4c8f-859a-735976530e03\\\",\\\"systemUUID\\\":\\\"e97abf98-298f-4589-a70a-4cfb5cb2994a\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:41Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:41 crc kubenswrapper[4816]: I0316 00:09:41.180624 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:09:41 crc kubenswrapper[4816]: I0316 00:09:41.180720 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:09:41 crc kubenswrapper[4816]: I0316 00:09:41.180758 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:09:41 crc kubenswrapper[4816]: I0316 00:09:41.180791 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:09:41 crc kubenswrapper[4816]: I0316 00:09:41.180815 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:09:41Z","lastTransitionTime":"2026-03-16T00:09:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:09:41 crc kubenswrapper[4816]: E0316 00:09:41.199668 4816 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d8fb348c-4907-4c8f-859a-735976530e03\\\",\\\"systemUUID\\\":\\\"e97abf98-298f-4589-a70a-4cfb5cb2994a\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:41Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:41 crc kubenswrapper[4816]: I0316 00:09:41.205017 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:09:41 crc kubenswrapper[4816]: I0316 00:09:41.205077 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:09:41 crc kubenswrapper[4816]: I0316 00:09:41.205096 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:09:41 crc kubenswrapper[4816]: I0316 00:09:41.205123 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:09:41 crc kubenswrapper[4816]: I0316 00:09:41.205143 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:09:41Z","lastTransitionTime":"2026-03-16T00:09:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:09:41 crc kubenswrapper[4816]: E0316 00:09:41.223915 4816 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d8fb348c-4907-4c8f-859a-735976530e03\\\",\\\"systemUUID\\\":\\\"e97abf98-298f-4589-a70a-4cfb5cb2994a\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:41Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:41 crc kubenswrapper[4816]: I0316 00:09:41.228379 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:09:41 crc kubenswrapper[4816]: I0316 00:09:41.228421 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:09:41 crc kubenswrapper[4816]: I0316 00:09:41.228433 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:09:41 crc kubenswrapper[4816]: I0316 00:09:41.228458 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:09:41 crc kubenswrapper[4816]: I0316 00:09:41.228472 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:09:41Z","lastTransitionTime":"2026-03-16T00:09:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:09:41 crc kubenswrapper[4816]: E0316 00:09:41.245928 4816 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d8fb348c-4907-4c8f-859a-735976530e03\\\",\\\"systemUUID\\\":\\\"e97abf98-298f-4589-a70a-4cfb5cb2994a\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:41Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:41 crc kubenswrapper[4816]: E0316 00:09:41.246175 4816 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 16 00:09:42 crc kubenswrapper[4816]: I0316 00:09:42.666717 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:09:42 crc kubenswrapper[4816]: I0316 00:09:42.666782 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:09:42 crc kubenswrapper[4816]: I0316 00:09:42.666731 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:09:42 crc kubenswrapper[4816]: I0316 00:09:42.666726 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jqsjn" Mar 16 00:09:42 crc kubenswrapper[4816]: E0316 00:09:42.667230 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:09:42 crc kubenswrapper[4816]: E0316 00:09:42.667049 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:09:42 crc kubenswrapper[4816]: E0316 00:09:42.667425 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:09:42 crc kubenswrapper[4816]: E0316 00:09:42.667928 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jqsjn" podUID="84360ef9-0450-44c5-80eb-eab1bf8e808b" Mar 16 00:09:42 crc kubenswrapper[4816]: E0316 00:09:42.800623 4816 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 16 00:09:44 crc kubenswrapper[4816]: I0316 00:09:44.667675 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jqsjn" Mar 16 00:09:44 crc kubenswrapper[4816]: I0316 00:09:44.667769 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:09:44 crc kubenswrapper[4816]: I0316 00:09:44.667810 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:09:44 crc kubenswrapper[4816]: E0316 00:09:44.668065 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jqsjn" podUID="84360ef9-0450-44c5-80eb-eab1bf8e808b" Mar 16 00:09:44 crc kubenswrapper[4816]: I0316 00:09:44.668136 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:09:44 crc kubenswrapper[4816]: E0316 00:09:44.668282 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:09:44 crc kubenswrapper[4816]: E0316 00:09:44.668400 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:09:44 crc kubenswrapper[4816]: E0316 00:09:44.668908 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:09:46 crc kubenswrapper[4816]: I0316 00:09:46.667005 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:09:46 crc kubenswrapper[4816]: I0316 00:09:46.667080 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:09:46 crc kubenswrapper[4816]: E0316 00:09:46.667236 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:09:46 crc kubenswrapper[4816]: I0316 00:09:46.667294 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jqsjn" Mar 16 00:09:46 crc kubenswrapper[4816]: I0316 00:09:46.667342 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:09:46 crc kubenswrapper[4816]: E0316 00:09:46.667536 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:09:46 crc kubenswrapper[4816]: E0316 00:09:46.667679 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jqsjn" podUID="84360ef9-0450-44c5-80eb-eab1bf8e808b" Mar 16 00:09:46 crc kubenswrapper[4816]: E0316 00:09:46.667763 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:09:47 crc kubenswrapper[4816]: I0316 00:09:47.702111 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49e35ef6-7a6d-438f-b598-4445d73db379\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbe90b49c5d42bb9d6d5d1b8f1d02b571403dd3f39d3118081351d72c4910914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49
117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://508e6dbe6f2f1392d16501e09ee2ec523a3c2a245cd96c6ab773e26e83b8bb6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b5a3223ff0fb5b96e5b87ee836add66498165759f4d6ed8cb6413f091967e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\
\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://603bdcbd4a6af5fda7ca09e4823e0db8d7ec3e3eb38eddf9f062a14b433cb094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b061f16a65a27b9a5ef66231376b70fec084479a991b7fdd430957df76da32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\
\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19c46be8bb0c6604bb4f702d14472b782280ce716a3b5fe87a4fa4f1b15b174\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://baceacd0418c0bbcbd2e875008286443b263336ba956d8e4584147ed949b3f69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\
\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24e7adc2ddce8a281cd270db47b3528d2a4965ab6d41a14ecb5aaea02daf0c45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:47Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:47 crc kubenswrapper[4816]: I0316 00:09:47.725893 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-szscw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9789e58-12c8-4831-9401-af48a3e92209\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b45d6058e72f7117fdfb86b3480de77c8592dba7dcd7932b2a5c34036da7af26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6f0b7e8f95bbbf3b90133349b8b13d2784582a5d95dae8dd159328b570ae962\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-16T00:09:18Z\\\",\\\"message\\\":\\\"2026-03-16T00:08:33+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_c9e0d937-8428-483b-a1c0-f9ef411a7cbe\\\\n2026-03-16T00:08:33+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_c9e0d937-8428-483b-a1c0-f9ef411a7cbe to /host/opt/cni/bin/\\\\n2026-03-16T00:08:33Z [verbose] multus-daemon started\\\\n2026-03-16T00:08:33Z [verbose] 
Readiness Indicator file check\\\\n2026-03-16T00:09:18Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:31Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:09:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxf6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-szscw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:47Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:47 crc kubenswrapper[4816]: I0316 00:09:47.744387 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd08ece2-7636-4966-973a-e96a34b70b53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6dd22d6548986bf126a8ce6b30c7c11d8c37bd0c03539a55cc28fb7debcfc26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trwbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7003a4592a48f87ab63e9f5637c7665040f9784a
7c8f599c82ed61270ddb793c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trwbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jrdcz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:47Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:47 crc kubenswrapper[4816]: I0316 00:09:47.784076 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f47b6c54d5d72d827208f4d8228be260d9d19b4a720b3174349288dee2916641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e0a510814d60106574e3cd3e612c031ac900caa0f24a12a62d7cd1f6bfda1ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://826495d8ab8f414169bf36159278da0a040fec699a74b4aebdc8f87ccdc48bd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa5f052d32ad8f52f33ee5baea5af98a64f05b831b86ebced6c9422fb4845fcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86c28f1ae2f2fb61b7755ac14bc7f5346036735c75daccaa0e31b4e606cfb4c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fac0a986edf984b0e34eda0f969205d5bc92f23d0edacb3c6dd8721eaed2fb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba7195b779774df29cc722244d9b4290e7f75033f6e135e41288fbb5310a7c3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba7195b779774df29cc722244d9b4290e7f75033f6e135e41288fbb5310a7c3c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-16T00:09:25Z\\\",\\\"message\\\":\\\":[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {960d98b2-dc64-4e93-a4b6-9b19847af71e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0316 00:09:25.816315 7422 services_controller.go:443] Built service openshift-etcd-operator/metrics LB cluster-wide 
configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.188\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:443, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI0316 00:09:25.816331 7422 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kbgw5\\\\nI0316 00:09:25.816342 7422 services_controller.go:444] Built service openshift-etcd-operator/metrics LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI0316 00:09:25.815541 7422 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-target-xd92c\\\\nI0316 00:09:25.815313 7422 services_controller.go:443] Built service openshift-machine-config-operator/machine-config-controller LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.16\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:9001, clus\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:09:24Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-psjs7_openshift-ovn-kubernetes(2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d6bf581280690c99ddbbecd4cf4e215a12230d377978aa48acd3c05b85a3c56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1249f8b1c01db98230c10bde49bdb68f17caccc1ba0546e462df4384ca1658dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1249f8b1c01db98230
c10bde49bdb68f17caccc1ba0546e462df4384ca1658dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rd68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-psjs7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:47Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:47 crc kubenswrapper[4816]: E0316 00:09:47.801620 4816 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 16 00:09:47 crc kubenswrapper[4816]: I0316 00:09:47.805434 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"580d9565-d55c-4108-af2d-07ab6fa7ea2e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2a41f624c9d40a637cf958648b795587652d7139d4b7cea2e8c70d1cdf05dd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":
[{\\\"containerID\\\":\\\"cri-o://59fcd3602ca5395b4f9b934768cad48d5504480af46a1fff8203eb46415b8769\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59fcd3602ca5395b4f9b934768cad48d5504480af46a1fff8203eb46415b8769\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:47Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:47 crc kubenswrapper[4816]: I0316 00:09:47.833963 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://723654bc91ce4d40a27b14a9c4df1ed6c8dff2517f378927d7a6a96aeaa6951b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:47Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:47 crc kubenswrapper[4816]: I0316 00:09:47.856602 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37869baff974556c22aadadfa4b528b5d6b9ad236107b4dbc945b3cdbf743f20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://2f9fe727708fdaca68596b7d305629ade45b061a88a81085f885d490ab99e24b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:47Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:47 crc kubenswrapper[4816]: I0316 00:09:47.877336 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cnhkf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e686cd4-bddf-463e-b471-e49ea862691e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f6c3cacd4d7f6866e66b7234b280540ef2c443531de7a11f6894323ef24a9fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bwzt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cnhkf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:47Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:47 crc kubenswrapper[4816]: I0316 00:09:47.897353 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kbgw5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b28986d-e33b-4876-ab6d-64d69960fb8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://544bab5013b513f755dd9fa88cbc71021b0cfe823aa7508af994c0b1342c3646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7
93426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zgr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0beb1aba8b8813abd98809200fa4dd348377427596c2935ceb012361634df2b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zgr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kbgw5\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:47Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:47 crc kubenswrapper[4816]: I0316 00:09:47.923267 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c8786d1-025d-4788-bafe-c2c2eaf8e398\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46db23088e8ba791d736f70a65bd066af80a5ec27c31332d9ac9c52530b21bfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubern
etes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da7b257af12b9ee605dc32a26d9565f866a9cafebb4936a0bde7f42ae9909f80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2523ea43f2490afb81354be8a2840f54a3be1a8d3154196e0c8ac365d09723a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e7f3b9b1fc1648908bbf44f9ad32e1eb510b68336a8b60c8417df2f622a36e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cl
uster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0316 00:07:59.373922 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0316 00:07:59.374075 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0316 00:07:59.374860 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2696044728/tls.crt::/tmp/serving-cert-2696044728/tls.key\\\\\\\"\\\\nI0316 00:07:59.560928 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0316 00:07:59.563996 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0316 00:07:59.564013 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0316 00:07:59.564035 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0316 00:07:59.564041 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0316 00:07:59.571475 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0316 00:07:59.571523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571531 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:59.571540 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0316 00:07:59.571574 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0316 00:07:59.571588 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0316 00:07:59.571493 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0316 00:07:59.571597 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0316 00:07:59.573484 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81d0c40be9c3912864280cd98481f837ef29f69d85599b39decf5e9d7a97eff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:47Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:47 crc kubenswrapper[4816]: I0316 00:09:47.940282 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://516852108b3d17a15676573ff080d3d94fa48d8076ed0404a8d827e715c2555e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-16T00:09:47Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:47 crc kubenswrapper[4816]: I0316 00:09:47.958802 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:47Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:47 crc kubenswrapper[4816]: I0316 00:09:47.980129 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mt7bq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"03ef49f1-0c6a-443a-8df3-2db339c562ed\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d31459f93f54cc1cd99638b1f108b8997b34f246cf8c65e5462e672f04aaa7c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8979f751ec4797986924a3efea84d02bc1c1cf303e1e992f349c9083f2fe4f08\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8979f751ec4797986924a3efea84d02bc1c1cf303e1e992f349c9083f2fe4f08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa87691b0dc314209d88d1f4d5b2227abc1de527e8111831bb8243ba6192c300\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa87691b0dc314209d88d1f4d5b2227abc1de527e8111831bb8243ba6192c300\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:33Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33f82b993e3d729a5830e88136e74687c930d6ad1a15c64f003f311fda418bce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33f82b993e3d729a5830e88136e74687c930d6ad1a15c64f003f311fda418bce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b640c
26797535db11051d50aefd13b7b2759cfa19d163f53764b0fe8710cefdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b640c26797535db11051d50aefd13b7b2759cfa19d163f53764b0fe8710cefdc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe5359762a36b39c9f8ae939a1b9d094c42ffab59c4b9743a83b4391c171a512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe5359762a36b39c9f8ae939a1b9d094c42ffab59c4b9743a83b4391c171a512\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:36Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc31f057b776cf1925d265b64c7a9a9fb0993e4f3ffff530a48f28865e1c2954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc31f057b776cf1925d265b64c7a9a9fb0993e4f3ffff530a48f28865e1c2954\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfvdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mt7bq\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:47Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:47 crc kubenswrapper[4816]: I0316 00:09:47.997712 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lhpbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ec6a8ee-efd9-45df-bb35-706fcc90ebe9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a34b33696ef85102800de34359b2f8b4bf11c03ca6347dde766456bcea20e0e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-03-16T00:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c2x8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lhpbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:47Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:48 crc kubenswrapper[4816]: I0316 00:09:48.019205 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a695b85-75f5-42bb-a13c-aa20512355e4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6e7e97dbf041ed59e094f9be34956cb28d4774022777f2371a2eb752937f551\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fea427361067bc1a9d56b7b6699b072b5cdeee8345bf6618b8d81c6848f62098\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:20Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0316 00:06:49.819775 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0316 00:06:49.823008 1 observer_polling.go:159] Starting file observer\\\\nI0316 00:06:49.860579 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0316 00:06:49.866006 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0316 00:07:20.134248 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:19Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f3dfc8f46079b51e52802920a734bf796a00db2cf42501b9f7e202e0e9bc2ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5afc1a568801d941c814fe5e4eb91df609672a9431082d578d600f68f669aa3f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c60304164aa2f8e740ab25d779ef9b0aa66f6acca476bae45a13f4be44e37e33\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:48Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:48 crc kubenswrapper[4816]: I0316 00:09:48.040047 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3a9d51fa-9450-4f18-be16-0a93d64889b4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94c8e0e7b0ecd4d0f525f55538bff0b01653544be87c923fc152d4d982da7021\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02c81b3d94044c1f4f344dd8306cacf8db533a662535ddb0a6a4ffc1bb4cf1f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37dea82cd38ef9cfbda3626b789b21af2ada4b4a58bc2ecc6ce55eda60052277\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e44b0bbe4cebe4e52726f3f654f8b4a2953abcd0ad32db752c4641b3f0be1b00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://e44b0bbe4cebe4e52726f3f654f8b4a2953abcd0ad32db752c4641b3f0be1b00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:48Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:47Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:48Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:48 crc kubenswrapper[4816]: I0316 00:09:48.058655 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:48Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:48 crc kubenswrapper[4816]: I0316 00:09:48.075992 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:48Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:48 crc kubenswrapper[4816]: I0316 00:09:48.090696 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jqsjn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84360ef9-0450-44c5-80eb-eab1bf8e808b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxldb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxldb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jqsjn\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:48Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:48 crc kubenswrapper[4816]: I0316 00:09:48.667418 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:09:48 crc kubenswrapper[4816]: I0316 00:09:48.667428 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:09:48 crc kubenswrapper[4816]: E0316 00:09:48.667628 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:09:48 crc kubenswrapper[4816]: E0316 00:09:48.667740 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:09:48 crc kubenswrapper[4816]: I0316 00:09:48.667344 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:09:48 crc kubenswrapper[4816]: E0316 00:09:48.668146 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:09:48 crc kubenswrapper[4816]: I0316 00:09:48.668368 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jqsjn" Mar 16 00:09:48 crc kubenswrapper[4816]: E0316 00:09:48.668664 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jqsjn" podUID="84360ef9-0450-44c5-80eb-eab1bf8e808b" Mar 16 00:09:49 crc kubenswrapper[4816]: I0316 00:09:49.225449 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/84360ef9-0450-44c5-80eb-eab1bf8e808b-metrics-certs\") pod \"network-metrics-daemon-jqsjn\" (UID: \"84360ef9-0450-44c5-80eb-eab1bf8e808b\") " pod="openshift-multus/network-metrics-daemon-jqsjn" Mar 16 00:09:49 crc kubenswrapper[4816]: E0316 00:09:49.225637 4816 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 16 00:09:49 crc kubenswrapper[4816]: E0316 00:09:49.225689 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/84360ef9-0450-44c5-80eb-eab1bf8e808b-metrics-certs podName:84360ef9-0450-44c5-80eb-eab1bf8e808b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:53.225674012 +0000 UTC m=+246.321973975 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/84360ef9-0450-44c5-80eb-eab1bf8e808b-metrics-certs") pod "network-metrics-daemon-jqsjn" (UID: "84360ef9-0450-44c5-80eb-eab1bf8e808b") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 16 00:09:50 crc kubenswrapper[4816]: I0316 00:09:50.666890 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jqsjn" Mar 16 00:09:50 crc kubenswrapper[4816]: I0316 00:09:50.666886 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:09:50 crc kubenswrapper[4816]: I0316 00:09:50.666895 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:09:50 crc kubenswrapper[4816]: I0316 00:09:50.666776 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:09:50 crc kubenswrapper[4816]: E0316 00:09:50.667491 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jqsjn" podUID="84360ef9-0450-44c5-80eb-eab1bf8e808b" Mar 16 00:09:50 crc kubenswrapper[4816]: E0316 00:09:50.667714 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:09:50 crc kubenswrapper[4816]: E0316 00:09:50.667844 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:09:50 crc kubenswrapper[4816]: E0316 00:09:50.667911 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:09:50 crc kubenswrapper[4816]: I0316 00:09:50.669056 4816 scope.go:117] "RemoveContainer" containerID="ba7195b779774df29cc722244d9b4290e7f75033f6e135e41288fbb5310a7c3c" Mar 16 00:09:50 crc kubenswrapper[4816]: E0316 00:09:50.669324 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-psjs7_openshift-ovn-kubernetes(2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88)\"" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" podUID="2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" Mar 16 00:09:51 crc kubenswrapper[4816]: I0316 00:09:51.259520 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:09:51 crc kubenswrapper[4816]: I0316 00:09:51.259605 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:09:51 crc kubenswrapper[4816]: I0316 00:09:51.259630 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:09:51 crc kubenswrapper[4816]: I0316 00:09:51.259654 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:09:51 crc kubenswrapper[4816]: I0316 00:09:51.259670 4816 setters.go:603] "Node became 
not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:09:51Z","lastTransitionTime":"2026-03-16T00:09:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:09:51 crc kubenswrapper[4816]: E0316 00:09:51.281183 4816 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d8fb348c-4907-4c8f-859a-735976530e03\\\",\\\"systemUUID\\\":\\\"e97abf98-298f-4589-a70a-4cfb5cb2994a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:51Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:51 crc kubenswrapper[4816]: I0316 00:09:51.286906 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:09:51 crc kubenswrapper[4816]: I0316 00:09:51.286979 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:09:51 crc kubenswrapper[4816]: I0316 00:09:51.286998 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:09:51 crc kubenswrapper[4816]: I0316 00:09:51.287025 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:09:51 crc kubenswrapper[4816]: I0316 00:09:51.287044 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:09:51Z","lastTransitionTime":"2026-03-16T00:09:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:09:51 crc kubenswrapper[4816]: E0316 00:09:51.306503 4816 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d8fb348c-4907-4c8f-859a-735976530e03\\\",\\\"systemUUID\\\":\\\"e97abf98-298f-4589-a70a-4cfb5cb2994a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:51Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:51 crc kubenswrapper[4816]: I0316 00:09:51.311272 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:09:51 crc kubenswrapper[4816]: I0316 00:09:51.311328 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:09:51 crc kubenswrapper[4816]: I0316 00:09:51.311357 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:09:51 crc kubenswrapper[4816]: I0316 00:09:51.311389 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:09:51 crc kubenswrapper[4816]: I0316 00:09:51.311412 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:09:51Z","lastTransitionTime":"2026-03-16T00:09:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:09:51 crc kubenswrapper[4816]: E0316 00:09:51.330307 4816 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d8fb348c-4907-4c8f-859a-735976530e03\\\",\\\"systemUUID\\\":\\\"e97abf98-298f-4589-a70a-4cfb5cb2994a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:51Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:51 crc kubenswrapper[4816]: I0316 00:09:51.335940 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:09:51 crc kubenswrapper[4816]: I0316 00:09:51.335990 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:09:51 crc kubenswrapper[4816]: I0316 00:09:51.336007 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:09:51 crc kubenswrapper[4816]: I0316 00:09:51.336030 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:09:51 crc kubenswrapper[4816]: I0316 00:09:51.336046 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:09:51Z","lastTransitionTime":"2026-03-16T00:09:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:09:51 crc kubenswrapper[4816]: E0316 00:09:51.354623 4816 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d8fb348c-4907-4c8f-859a-735976530e03\\\",\\\"systemUUID\\\":\\\"e97abf98-298f-4589-a70a-4cfb5cb2994a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:51Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:51 crc kubenswrapper[4816]: I0316 00:09:51.360657 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:09:51 crc kubenswrapper[4816]: I0316 00:09:51.360770 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:09:51 crc kubenswrapper[4816]: I0316 00:09:51.360793 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:09:51 crc kubenswrapper[4816]: I0316 00:09:51.360827 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:09:51 crc kubenswrapper[4816]: I0316 00:09:51.360849 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:09:51Z","lastTransitionTime":"2026-03-16T00:09:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:09:51 crc kubenswrapper[4816]: E0316 00:09:51.383287 4816 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d8fb348c-4907-4c8f-859a-735976530e03\\\",\\\"systemUUID\\\":\\\"e97abf98-298f-4589-a70a-4cfb5cb2994a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:51Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:51 crc kubenswrapper[4816]: E0316 00:09:51.383533 4816 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 16 00:09:52 crc kubenswrapper[4816]: I0316 00:09:52.666892 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:09:52 crc kubenswrapper[4816]: I0316 00:09:52.666951 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jqsjn" Mar 16 00:09:52 crc kubenswrapper[4816]: I0316 00:09:52.667024 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:09:52 crc kubenswrapper[4816]: E0316 00:09:52.667138 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:09:52 crc kubenswrapper[4816]: I0316 00:09:52.667251 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:09:52 crc kubenswrapper[4816]: E0316 00:09:52.667375 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:09:52 crc kubenswrapper[4816]: E0316 00:09:52.667477 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:09:52 crc kubenswrapper[4816]: E0316 00:09:52.667657 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jqsjn" podUID="84360ef9-0450-44c5-80eb-eab1bf8e808b" Mar 16 00:09:52 crc kubenswrapper[4816]: E0316 00:09:52.803048 4816 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 16 00:09:54 crc kubenswrapper[4816]: I0316 00:09:54.667167 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jqsjn" Mar 16 00:09:54 crc kubenswrapper[4816]: I0316 00:09:54.667205 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:09:54 crc kubenswrapper[4816]: I0316 00:09:54.667244 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:09:54 crc kubenswrapper[4816]: E0316 00:09:54.667367 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jqsjn" podUID="84360ef9-0450-44c5-80eb-eab1bf8e808b" Mar 16 00:09:54 crc kubenswrapper[4816]: I0316 00:09:54.667398 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:09:54 crc kubenswrapper[4816]: E0316 00:09:54.667764 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:09:54 crc kubenswrapper[4816]: E0316 00:09:54.667543 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:09:54 crc kubenswrapper[4816]: E0316 00:09:54.667869 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:09:56 crc kubenswrapper[4816]: I0316 00:09:56.666684 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:09:56 crc kubenswrapper[4816]: I0316 00:09:56.666734 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:09:56 crc kubenswrapper[4816]: I0316 00:09:56.666754 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:09:56 crc kubenswrapper[4816]: I0316 00:09:56.666691 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jqsjn" Mar 16 00:09:56 crc kubenswrapper[4816]: E0316 00:09:56.666911 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:09:56 crc kubenswrapper[4816]: E0316 00:09:56.667128 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:09:56 crc kubenswrapper[4816]: E0316 00:09:56.667495 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jqsjn" podUID="84360ef9-0450-44c5-80eb-eab1bf8e808b" Mar 16 00:09:56 crc kubenswrapper[4816]: E0316 00:09:56.667751 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:09:57 crc kubenswrapper[4816]: I0316 00:09:57.724597 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=57.724516899 podStartE2EDuration="57.724516899s" podCreationTimestamp="2026-03-16 00:09:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:09:57.703988684 +0000 UTC m=+190.800288667" watchObservedRunningTime="2026-03-16 00:09:57.724516899 +0000 UTC m=+190.820816892" Mar 16 00:09:57 crc kubenswrapper[4816]: I0316 00:09:57.724957 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=72.724945531 podStartE2EDuration="1m12.724945531s" podCreationTimestamp="2026-03-16 00:08:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:09:57.724928961 +0000 UTC m=+190.821228954" watchObservedRunningTime="2026-03-16 00:09:57.724945531 +0000 UTC m=+190.821245514" Mar 16 00:09:57 crc kubenswrapper[4816]: E0316 00:09:57.803950 4816 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 16 00:09:57 crc kubenswrapper[4816]: I0316 00:09:57.809339 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=96.809309155 podStartE2EDuration="1m36.809309155s" podCreationTimestamp="2026-03-16 00:08:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:09:57.807634906 +0000 UTC m=+190.903934939" watchObservedRunningTime="2026-03-16 00:09:57.809309155 +0000 UTC m=+190.905609188" Mar 16 00:09:57 crc kubenswrapper[4816]: I0316 00:09:57.853688 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-szscw" podStartSLOduration=140.853661701 podStartE2EDuration="2m20.853661701s" podCreationTimestamp="2026-03-16 00:07:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:09:57.832423276 +0000 UTC m=+190.928723279" watchObservedRunningTime="2026-03-16 00:09:57.853661701 +0000 UTC m=+190.949961694" Mar 16 00:09:57 crc kubenswrapper[4816]: I0316 00:09:57.895310 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" podStartSLOduration=140.895280856 podStartE2EDuration="2m20.895280856s" podCreationTimestamp="2026-03-16 00:07:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:09:57.853133555 +0000 UTC m=+190.949433578" watchObservedRunningTime="2026-03-16 00:09:57.895280856 +0000 UTC m=+190.991580849" Mar 16 00:09:57 crc kubenswrapper[4816]: I0316 00:09:57.912338 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=43.912312448 podStartE2EDuration="43.912312448s" 
podCreationTimestamp="2026-03-16 00:09:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:09:57.910946597 +0000 UTC m=+191.007246590" watchObservedRunningTime="2026-03-16 00:09:57.912312448 +0000 UTC m=+191.008612441" Mar 16 00:09:57 crc kubenswrapper[4816]: I0316 00:09:57.966899 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-cnhkf" podStartSLOduration=140.966864814 podStartE2EDuration="2m20.966864814s" podCreationTimestamp="2026-03-16 00:07:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:09:57.966842233 +0000 UTC m=+191.063142226" watchObservedRunningTime="2026-03-16 00:09:57.966864814 +0000 UTC m=+191.063164807" Mar 16 00:09:57 crc kubenswrapper[4816]: I0316 00:09:57.992077 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kbgw5" podStartSLOduration=139.992054035 podStartE2EDuration="2m19.992054035s" podCreationTimestamp="2026-03-16 00:07:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:09:57.991041686 +0000 UTC m=+191.087341669" watchObservedRunningTime="2026-03-16 00:09:57.992054035 +0000 UTC m=+191.088354008" Mar 16 00:09:58 crc kubenswrapper[4816]: I0316 00:09:58.013596 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=98.013541158 podStartE2EDuration="1m38.013541158s" podCreationTimestamp="2026-03-16 00:08:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:09:58.013148596 +0000 UTC m=+191.109448549" 
watchObservedRunningTime="2026-03-16 00:09:58.013541158 +0000 UTC m=+191.109841151" Mar 16 00:09:58 crc kubenswrapper[4816]: I0316 00:09:58.085963 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-mt7bq" podStartSLOduration=141.08594475 podStartE2EDuration="2m21.08594475s" podCreationTimestamp="2026-03-16 00:07:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:09:58.085198718 +0000 UTC m=+191.181498681" watchObservedRunningTime="2026-03-16 00:09:58.08594475 +0000 UTC m=+191.182244703" Mar 16 00:09:58 crc kubenswrapper[4816]: I0316 00:09:58.667139 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:09:58 crc kubenswrapper[4816]: E0316 00:09:58.667320 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:09:58 crc kubenswrapper[4816]: I0316 00:09:58.667704 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:09:58 crc kubenswrapper[4816]: I0316 00:09:58.667857 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jqsjn" Mar 16 00:09:58 crc kubenswrapper[4816]: E0316 00:09:58.668006 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jqsjn" podUID="84360ef9-0450-44c5-80eb-eab1bf8e808b" Mar 16 00:09:58 crc kubenswrapper[4816]: I0316 00:09:58.668250 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:09:58 crc kubenswrapper[4816]: E0316 00:09:58.668256 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:09:58 crc kubenswrapper[4816]: E0316 00:09:58.668829 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:10:00 crc kubenswrapper[4816]: I0316 00:10:00.667204 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jqsjn" Mar 16 00:10:00 crc kubenswrapper[4816]: I0316 00:10:00.667814 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:10:00 crc kubenswrapper[4816]: I0316 00:10:00.667836 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:10:00 crc kubenswrapper[4816]: E0316 00:10:00.667945 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jqsjn" podUID="84360ef9-0450-44c5-80eb-eab1bf8e808b" Mar 16 00:10:00 crc kubenswrapper[4816]: I0316 00:10:00.668252 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:10:00 crc kubenswrapper[4816]: E0316 00:10:00.668346 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:10:00 crc kubenswrapper[4816]: E0316 00:10:00.668599 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:10:00 crc kubenswrapper[4816]: E0316 00:10:00.668796 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:10:01 crc kubenswrapper[4816]: I0316 00:10:01.483891 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:10:01 crc kubenswrapper[4816]: I0316 00:10:01.483960 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:10:01 crc kubenswrapper[4816]: I0316 00:10:01.483984 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:10:01 crc kubenswrapper[4816]: I0316 00:10:01.484013 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:10:01 crc kubenswrapper[4816]: I0316 00:10:01.484030 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:10:01Z","lastTransitionTime":"2026-03-16T00:10:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:10:01 crc kubenswrapper[4816]: I0316 00:10:01.550921 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-lhpbn" podStartSLOduration=144.550890242 podStartE2EDuration="2m24.550890242s" podCreationTimestamp="2026-03-16 00:07:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:09:58.105763483 +0000 UTC m=+191.202063436" watchObservedRunningTime="2026-03-16 00:10:01.550890242 +0000 UTC m=+194.647190235" Mar 16 00:10:01 crc kubenswrapper[4816]: I0316 00:10:01.552825 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-dcpnr"] Mar 16 00:10:01 crc kubenswrapper[4816]: I0316 00:10:01.553319 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dcpnr" Mar 16 00:10:01 crc kubenswrapper[4816]: I0316 00:10:01.558458 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 16 00:10:01 crc kubenswrapper[4816]: I0316 00:10:01.558458 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 16 00:10:01 crc kubenswrapper[4816]: I0316 00:10:01.558887 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 16 00:10:01 crc kubenswrapper[4816]: I0316 00:10:01.559032 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 16 00:10:01 crc kubenswrapper[4816]: I0316 00:10:01.668722 4816 scope.go:117] "RemoveContainer" containerID="ba7195b779774df29cc722244d9b4290e7f75033f6e135e41288fbb5310a7c3c" Mar 16 00:10:01 crc kubenswrapper[4816]: E0316 
00:10:01.669049 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-psjs7_openshift-ovn-kubernetes(2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88)\"" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" podUID="2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" Mar 16 00:10:01 crc kubenswrapper[4816]: I0316 00:10:01.671788 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/c33fcbae-202d-40ad-a561-e15eddf3cb4c-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-dcpnr\" (UID: \"c33fcbae-202d-40ad-a561-e15eddf3cb4c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dcpnr" Mar 16 00:10:01 crc kubenswrapper[4816]: I0316 00:10:01.672158 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c33fcbae-202d-40ad-a561-e15eddf3cb4c-service-ca\") pod \"cluster-version-operator-5c965bbfc6-dcpnr\" (UID: \"c33fcbae-202d-40ad-a561-e15eddf3cb4c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dcpnr" Mar 16 00:10:01 crc kubenswrapper[4816]: I0316 00:10:01.672379 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c33fcbae-202d-40ad-a561-e15eddf3cb4c-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-dcpnr\" (UID: \"c33fcbae-202d-40ad-a561-e15eddf3cb4c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dcpnr" Mar 16 00:10:01 crc kubenswrapper[4816]: I0316 00:10:01.672688 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: 
\"kubernetes.io/host-path/c33fcbae-202d-40ad-a561-e15eddf3cb4c-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-dcpnr\" (UID: \"c33fcbae-202d-40ad-a561-e15eddf3cb4c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dcpnr" Mar 16 00:10:01 crc kubenswrapper[4816]: I0316 00:10:01.672996 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c33fcbae-202d-40ad-a561-e15eddf3cb4c-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-dcpnr\" (UID: \"c33fcbae-202d-40ad-a561-e15eddf3cb4c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dcpnr" Mar 16 00:10:01 crc kubenswrapper[4816]: I0316 00:10:01.712243 4816 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Mar 16 00:10:01 crc kubenswrapper[4816]: I0316 00:10:01.721936 4816 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 16 00:10:01 crc kubenswrapper[4816]: I0316 00:10:01.774618 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c33fcbae-202d-40ad-a561-e15eddf3cb4c-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-dcpnr\" (UID: \"c33fcbae-202d-40ad-a561-e15eddf3cb4c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dcpnr" Mar 16 00:10:01 crc kubenswrapper[4816]: I0316 00:10:01.774760 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/c33fcbae-202d-40ad-a561-e15eddf3cb4c-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-dcpnr\" (UID: \"c33fcbae-202d-40ad-a561-e15eddf3cb4c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dcpnr" Mar 16 00:10:01 crc kubenswrapper[4816]: I0316 00:10:01.774812 4816 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c33fcbae-202d-40ad-a561-e15eddf3cb4c-service-ca\") pod \"cluster-version-operator-5c965bbfc6-dcpnr\" (UID: \"c33fcbae-202d-40ad-a561-e15eddf3cb4c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dcpnr" Mar 16 00:10:01 crc kubenswrapper[4816]: I0316 00:10:01.774843 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c33fcbae-202d-40ad-a561-e15eddf3cb4c-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-dcpnr\" (UID: \"c33fcbae-202d-40ad-a561-e15eddf3cb4c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dcpnr" Mar 16 00:10:01 crc kubenswrapper[4816]: I0316 00:10:01.774872 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/c33fcbae-202d-40ad-a561-e15eddf3cb4c-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-dcpnr\" (UID: \"c33fcbae-202d-40ad-a561-e15eddf3cb4c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dcpnr" Mar 16 00:10:01 crc kubenswrapper[4816]: I0316 00:10:01.774928 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/c33fcbae-202d-40ad-a561-e15eddf3cb4c-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-dcpnr\" (UID: \"c33fcbae-202d-40ad-a561-e15eddf3cb4c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dcpnr" Mar 16 00:10:01 crc kubenswrapper[4816]: I0316 00:10:01.775137 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/c33fcbae-202d-40ad-a561-e15eddf3cb4c-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-dcpnr\" (UID: 
\"c33fcbae-202d-40ad-a561-e15eddf3cb4c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dcpnr" Mar 16 00:10:01 crc kubenswrapper[4816]: I0316 00:10:01.776127 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c33fcbae-202d-40ad-a561-e15eddf3cb4c-service-ca\") pod \"cluster-version-operator-5c965bbfc6-dcpnr\" (UID: \"c33fcbae-202d-40ad-a561-e15eddf3cb4c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dcpnr" Mar 16 00:10:01 crc kubenswrapper[4816]: I0316 00:10:01.784846 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c33fcbae-202d-40ad-a561-e15eddf3cb4c-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-dcpnr\" (UID: \"c33fcbae-202d-40ad-a561-e15eddf3cb4c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dcpnr" Mar 16 00:10:01 crc kubenswrapper[4816]: I0316 00:10:01.796254 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c33fcbae-202d-40ad-a561-e15eddf3cb4c-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-dcpnr\" (UID: \"c33fcbae-202d-40ad-a561-e15eddf3cb4c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dcpnr" Mar 16 00:10:01 crc kubenswrapper[4816]: I0316 00:10:01.876370 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dcpnr" Mar 16 00:10:02 crc kubenswrapper[4816]: I0316 00:10:02.614100 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dcpnr" event={"ID":"c33fcbae-202d-40ad-a561-e15eddf3cb4c","Type":"ContainerStarted","Data":"f1a3b2e86078770d109d211ccf2fee60918ab6e50d5b768bd6a9ff73c900fcdd"} Mar 16 00:10:02 crc kubenswrapper[4816]: I0316 00:10:02.614465 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dcpnr" event={"ID":"c33fcbae-202d-40ad-a561-e15eddf3cb4c","Type":"ContainerStarted","Data":"be6102f089f6cd1ae95738a8b57c8ba13e4c5558709a37a45d02234f149ca5ce"} Mar 16 00:10:02 crc kubenswrapper[4816]: I0316 00:10:02.633868 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dcpnr" podStartSLOduration=145.633846616 podStartE2EDuration="2m25.633846616s" podCreationTimestamp="2026-03-16 00:07:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:02.632718093 +0000 UTC m=+195.729018086" watchObservedRunningTime="2026-03-16 00:10:02.633846616 +0000 UTC m=+195.730146609" Mar 16 00:10:02 crc kubenswrapper[4816]: I0316 00:10:02.666824 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:10:02 crc kubenswrapper[4816]: I0316 00:10:02.667330 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jqsjn" Mar 16 00:10:02 crc kubenswrapper[4816]: I0316 00:10:02.667242 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:10:02 crc kubenswrapper[4816]: I0316 00:10:02.667207 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:10:02 crc kubenswrapper[4816]: E0316 00:10:02.667854 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:10:02 crc kubenswrapper[4816]: E0316 00:10:02.668229 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:10:02 crc kubenswrapper[4816]: E0316 00:10:02.668316 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jqsjn" podUID="84360ef9-0450-44c5-80eb-eab1bf8e808b" Mar 16 00:10:02 crc kubenswrapper[4816]: E0316 00:10:02.668409 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:10:02 crc kubenswrapper[4816]: E0316 00:10:02.805608 4816 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 16 00:10:04 crc kubenswrapper[4816]: I0316 00:10:04.667328 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:10:04 crc kubenswrapper[4816]: I0316 00:10:04.667384 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:10:04 crc kubenswrapper[4816]: E0316 00:10:04.667452 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:10:04 crc kubenswrapper[4816]: I0316 00:10:04.667339 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jqsjn" Mar 16 00:10:04 crc kubenswrapper[4816]: E0316 00:10:04.667667 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:10:04 crc kubenswrapper[4816]: E0316 00:10:04.667719 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jqsjn" podUID="84360ef9-0450-44c5-80eb-eab1bf8e808b" Mar 16 00:10:04 crc kubenswrapper[4816]: I0316 00:10:04.668338 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:10:04 crc kubenswrapper[4816]: E0316 00:10:04.668639 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:10:05 crc kubenswrapper[4816]: I0316 00:10:05.627222 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-szscw_e9789e58-12c8-4831-9401-af48a3e92209/kube-multus/1.log" Mar 16 00:10:05 crc kubenswrapper[4816]: I0316 00:10:05.627972 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-szscw_e9789e58-12c8-4831-9401-af48a3e92209/kube-multus/0.log" Mar 16 00:10:05 crc kubenswrapper[4816]: I0316 00:10:05.628054 4816 generic.go:334] "Generic (PLEG): container finished" podID="e9789e58-12c8-4831-9401-af48a3e92209" containerID="b45d6058e72f7117fdfb86b3480de77c8592dba7dcd7932b2a5c34036da7af26" exitCode=1 Mar 16 00:10:05 crc kubenswrapper[4816]: I0316 00:10:05.628102 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-szscw" event={"ID":"e9789e58-12c8-4831-9401-af48a3e92209","Type":"ContainerDied","Data":"b45d6058e72f7117fdfb86b3480de77c8592dba7dcd7932b2a5c34036da7af26"} Mar 16 00:10:05 crc kubenswrapper[4816]: I0316 00:10:05.628149 4816 scope.go:117] "RemoveContainer" containerID="e6f0b7e8f95bbbf3b90133349b8b13d2784582a5d95dae8dd159328b570ae962" Mar 16 00:10:05 crc kubenswrapper[4816]: I0316 00:10:05.628928 4816 scope.go:117] "RemoveContainer" containerID="b45d6058e72f7117fdfb86b3480de77c8592dba7dcd7932b2a5c34036da7af26" Mar 16 00:10:05 crc kubenswrapper[4816]: E0316 00:10:05.629294 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-szscw_openshift-multus(e9789e58-12c8-4831-9401-af48a3e92209)\"" pod="openshift-multus/multus-szscw" podUID="e9789e58-12c8-4831-9401-af48a3e92209" Mar 16 00:10:06 crc kubenswrapper[4816]: I0316 00:10:06.634314 4816 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-szscw_e9789e58-12c8-4831-9401-af48a3e92209/kube-multus/1.log" Mar 16 00:10:06 crc kubenswrapper[4816]: I0316 00:10:06.667148 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:10:06 crc kubenswrapper[4816]: I0316 00:10:06.667284 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jqsjn" Mar 16 00:10:06 crc kubenswrapper[4816]: I0316 00:10:06.667327 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:10:06 crc kubenswrapper[4816]: I0316 00:10:06.667389 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:10:06 crc kubenswrapper[4816]: E0316 00:10:06.667386 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:10:06 crc kubenswrapper[4816]: E0316 00:10:06.667587 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:10:06 crc kubenswrapper[4816]: E0316 00:10:06.667637 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:10:06 crc kubenswrapper[4816]: E0316 00:10:06.668038 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jqsjn" podUID="84360ef9-0450-44c5-80eb-eab1bf8e808b" Mar 16 00:10:07 crc kubenswrapper[4816]: E0316 00:10:07.806799 4816 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 16 00:10:08 crc kubenswrapper[4816]: I0316 00:10:08.667545 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jqsjn" Mar 16 00:10:08 crc kubenswrapper[4816]: I0316 00:10:08.667617 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:10:08 crc kubenswrapper[4816]: I0316 00:10:08.667682 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:10:08 crc kubenswrapper[4816]: E0316 00:10:08.667809 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jqsjn" podUID="84360ef9-0450-44c5-80eb-eab1bf8e808b" Mar 16 00:10:08 crc kubenswrapper[4816]: I0316 00:10:08.667832 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:10:08 crc kubenswrapper[4816]: E0316 00:10:08.667980 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:10:08 crc kubenswrapper[4816]: E0316 00:10:08.668069 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:10:08 crc kubenswrapper[4816]: E0316 00:10:08.668160 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:10:10 crc kubenswrapper[4816]: I0316 00:10:10.667432 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:10:10 crc kubenswrapper[4816]: I0316 00:10:10.667431 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:10:10 crc kubenswrapper[4816]: E0316 00:10:10.667668 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:10:10 crc kubenswrapper[4816]: I0316 00:10:10.667441 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jqsjn" Mar 16 00:10:10 crc kubenswrapper[4816]: E0316 00:10:10.667871 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jqsjn" podUID="84360ef9-0450-44c5-80eb-eab1bf8e808b" Mar 16 00:10:10 crc kubenswrapper[4816]: E0316 00:10:10.667724 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:10:10 crc kubenswrapper[4816]: I0316 00:10:10.668416 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:10:10 crc kubenswrapper[4816]: E0316 00:10:10.668627 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:10:12 crc kubenswrapper[4816]: I0316 00:10:12.666696 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:10:12 crc kubenswrapper[4816]: I0316 00:10:12.666756 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:10:12 crc kubenswrapper[4816]: E0316 00:10:12.666860 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:10:12 crc kubenswrapper[4816]: I0316 00:10:12.666714 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jqsjn" Mar 16 00:10:12 crc kubenswrapper[4816]: I0316 00:10:12.670711 4816 scope.go:117] "RemoveContainer" containerID="ba7195b779774df29cc722244d9b4290e7f75033f6e135e41288fbb5310a7c3c" Mar 16 00:10:12 crc kubenswrapper[4816]: I0316 00:10:12.671675 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:10:12 crc kubenswrapper[4816]: E0316 00:10:12.672138 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:10:12 crc kubenswrapper[4816]: E0316 00:10:12.672946 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:10:12 crc kubenswrapper[4816]: E0316 00:10:12.673271 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jqsjn" podUID="84360ef9-0450-44c5-80eb-eab1bf8e808b" Mar 16 00:10:12 crc kubenswrapper[4816]: E0316 00:10:12.807954 4816 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 16 00:10:13 crc kubenswrapper[4816]: I0316 00:10:13.619834 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-jqsjn"] Mar 16 00:10:13 crc kubenswrapper[4816]: I0316 00:10:13.663805 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-psjs7_2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88/ovnkube-controller/3.log" Mar 16 00:10:13 crc kubenswrapper[4816]: I0316 00:10:13.666792 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jqsjn" Mar 16 00:10:13 crc kubenswrapper[4816]: E0316 00:10:13.666965 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jqsjn" podUID="84360ef9-0450-44c5-80eb-eab1bf8e808b" Mar 16 00:10:13 crc kubenswrapper[4816]: I0316 00:10:13.672015 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" event={"ID":"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88","Type":"ContainerStarted","Data":"7e5d87dc1889484bb2175c0613eda0b852c65a289a1c165f6adae2a822892aa2"} Mar 16 00:10:13 crc kubenswrapper[4816]: I0316 00:10:13.672622 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" Mar 16 00:10:13 crc kubenswrapper[4816]: I0316 00:10:13.716813 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" podStartSLOduration=156.716792595 podStartE2EDuration="2m36.716792595s" podCreationTimestamp="2026-03-16 00:07:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:13.716124667 +0000 UTC m=+206.812424630" watchObservedRunningTime="2026-03-16 00:10:13.716792595 +0000 UTC m=+206.813092548" Mar 16 00:10:14 crc kubenswrapper[4816]: I0316 00:10:14.667310 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:10:14 crc kubenswrapper[4816]: I0316 00:10:14.667345 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:10:14 crc kubenswrapper[4816]: I0316 00:10:14.667375 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:10:14 crc kubenswrapper[4816]: E0316 00:10:14.667514 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:10:14 crc kubenswrapper[4816]: E0316 00:10:14.667753 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:10:14 crc kubenswrapper[4816]: E0316 00:10:14.667921 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:10:15 crc kubenswrapper[4816]: I0316 00:10:15.667322 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jqsjn" Mar 16 00:10:15 crc kubenswrapper[4816]: E0316 00:10:15.667536 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jqsjn" podUID="84360ef9-0450-44c5-80eb-eab1bf8e808b" Mar 16 00:10:16 crc kubenswrapper[4816]: I0316 00:10:16.591617 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:16 crc kubenswrapper[4816]: E0316 00:10:16.591944 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:12:18.591894213 +0000 UTC m=+331.688194166 (durationBeforeRetry 2m2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:16 crc kubenswrapper[4816]: I0316 00:10:16.667659 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:10:16 crc kubenswrapper[4816]: I0316 00:10:16.667779 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:10:16 crc kubenswrapper[4816]: I0316 00:10:16.667666 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:10:16 crc kubenswrapper[4816]: E0316 00:10:16.667899 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:10:16 crc kubenswrapper[4816]: E0316 00:10:16.668016 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:10:16 crc kubenswrapper[4816]: E0316 00:10:16.668249 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:10:16 crc kubenswrapper[4816]: I0316 00:10:16.692895 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:10:16 crc kubenswrapper[4816]: I0316 00:10:16.692982 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:10:16 crc kubenswrapper[4816]: I0316 00:10:16.693033 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:10:16 crc kubenswrapper[4816]: I0316 00:10:16.693070 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:10:16 crc kubenswrapper[4816]: E0316 00:10:16.693112 4816 secret.go:188] Couldn't get secret 
openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 16 00:10:16 crc kubenswrapper[4816]: E0316 00:10:16.693143 4816 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 16 00:10:16 crc kubenswrapper[4816]: E0316 00:10:16.693211 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-16 00:12:18.693188919 +0000 UTC m=+331.789488932 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 16 00:10:16 crc kubenswrapper[4816]: E0316 00:10:16.693236 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-16 00:12:18.69322644 +0000 UTC m=+331.789526493 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 16 00:10:16 crc kubenswrapper[4816]: E0316 00:10:16.693249 4816 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 16 00:10:16 crc kubenswrapper[4816]: E0316 00:10:16.693279 4816 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 16 00:10:16 crc kubenswrapper[4816]: E0316 00:10:16.693298 4816 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 16 00:10:16 crc kubenswrapper[4816]: E0316 00:10:16.693348 4816 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 16 00:10:16 crc kubenswrapper[4816]: E0316 00:10:16.693407 4816 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 16 00:10:16 crc kubenswrapper[4816]: E0316 00:10:16.693425 4816 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 16 00:10:16 crc kubenswrapper[4816]: E0316 00:10:16.693373 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-16 00:12:18.693351893 +0000 UTC m=+331.789651886 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 16 00:10:16 crc kubenswrapper[4816]: E0316 00:10:16.693534 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-16 00:12:18.693508898 +0000 UTC m=+331.789808851 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 16 00:10:17 crc kubenswrapper[4816]: I0316 00:10:17.667257 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jqsjn" Mar 16 00:10:17 crc kubenswrapper[4816]: E0316 00:10:17.669930 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jqsjn" podUID="84360ef9-0450-44c5-80eb-eab1bf8e808b" Mar 16 00:10:17 crc kubenswrapper[4816]: E0316 00:10:17.808844 4816 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 16 00:10:18 crc kubenswrapper[4816]: I0316 00:10:18.667034 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:10:18 crc kubenswrapper[4816]: I0316 00:10:18.667211 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:10:18 crc kubenswrapper[4816]: I0316 00:10:18.667348 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:10:18 crc kubenswrapper[4816]: E0316 00:10:18.667340 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:10:18 crc kubenswrapper[4816]: E0316 00:10:18.667524 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:10:18 crc kubenswrapper[4816]: E0316 00:10:18.667874 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:10:18 crc kubenswrapper[4816]: I0316 00:10:18.668179 4816 scope.go:117] "RemoveContainer" containerID="b45d6058e72f7117fdfb86b3480de77c8592dba7dcd7932b2a5c34036da7af26" Mar 16 00:10:19 crc kubenswrapper[4816]: I0316 00:10:19.666694 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jqsjn" Mar 16 00:10:19 crc kubenswrapper[4816]: E0316 00:10:19.667238 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jqsjn" podUID="84360ef9-0450-44c5-80eb-eab1bf8e808b" Mar 16 00:10:19 crc kubenswrapper[4816]: I0316 00:10:19.686448 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-szscw_e9789e58-12c8-4831-9401-af48a3e92209/kube-multus/1.log" Mar 16 00:10:19 crc kubenswrapper[4816]: I0316 00:10:19.686531 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-szscw" event={"ID":"e9789e58-12c8-4831-9401-af48a3e92209","Type":"ContainerStarted","Data":"707ec2df051aa6206ac2bc1c4db6b5fe6b37467b90b6ee42dbf28f2b88e5d6e6"} Mar 16 00:10:20 crc kubenswrapper[4816]: I0316 00:10:20.667287 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:10:20 crc kubenswrapper[4816]: I0316 00:10:20.667372 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:10:20 crc kubenswrapper[4816]: I0316 00:10:20.667323 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:10:20 crc kubenswrapper[4816]: E0316 00:10:20.667518 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:10:20 crc kubenswrapper[4816]: E0316 00:10:20.667741 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:10:20 crc kubenswrapper[4816]: E0316 00:10:20.667875 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:10:21 crc kubenswrapper[4816]: I0316 00:10:21.666885 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jqsjn" Mar 16 00:10:21 crc kubenswrapper[4816]: E0316 00:10:21.667507 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jqsjn" podUID="84360ef9-0450-44c5-80eb-eab1bf8e808b" Mar 16 00:10:22 crc kubenswrapper[4816]: I0316 00:10:22.667010 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:10:22 crc kubenswrapper[4816]: E0316 00:10:22.667228 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:10:22 crc kubenswrapper[4816]: I0316 00:10:22.667051 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:10:22 crc kubenswrapper[4816]: E0316 00:10:22.667419 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:10:22 crc kubenswrapper[4816]: I0316 00:10:22.667011 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:10:22 crc kubenswrapper[4816]: E0316 00:10:22.667516 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:10:23 crc kubenswrapper[4816]: I0316 00:10:23.666907 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jqsjn" Mar 16 00:10:23 crc kubenswrapper[4816]: I0316 00:10:23.668993 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 16 00:10:23 crc kubenswrapper[4816]: I0316 00:10:23.669627 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 16 00:10:24 crc kubenswrapper[4816]: I0316 00:10:24.666918 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:10:24 crc kubenswrapper[4816]: I0316 00:10:24.666917 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:10:24 crc kubenswrapper[4816]: I0316 00:10:24.667110 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:10:24 crc kubenswrapper[4816]: I0316 00:10:24.669865 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 16 00:10:24 crc kubenswrapper[4816]: I0316 00:10:24.669912 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 16 00:10:24 crc kubenswrapper[4816]: I0316 00:10:24.669918 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 16 00:10:24 crc kubenswrapper[4816]: I0316 00:10:24.670226 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 16 00:10:31 crc kubenswrapper[4816]: I0316 00:10:31.944937 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Mar 16 00:10:31 crc kubenswrapper[4816]: I0316 00:10:31.993869 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-nnqsw"] Mar 16 00:10:31 crc kubenswrapper[4816]: I0316 00:10:31.994439 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-nnqsw" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.000177 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-6r96v"] Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.000510 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-6r96v" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.001603 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.001756 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.001766 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.001880 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.002091 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.002278 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.002405 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.002464 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.004088 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-9xv4p"] Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.004694 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-9xv4p" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.005720 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rqxss"] Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.006470 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rqxss" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.007438 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-2l7nk"] Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.007878 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-2l7nk" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.008427 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-v8zx4"] Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.008786 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-v8zx4" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.012383 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-pdm8d"] Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.012943 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-fnmb9"] Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.013161 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-pdm8d" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.014110 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-n7fkv"] Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.014519 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-n7fkv" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.014702 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-fnmb9" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.015265 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-zrq8d"] Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.015791 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zrq8d" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.016351 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-d9j8j"] Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.017194 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-d9j8j" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.017706 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-s8mgz"] Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.034745 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-s8mgz" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.036052 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.036240 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.036446 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.036682 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.036955 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.037443 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.037577 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.037860 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.038236 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.036462 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 16 00:10:32 crc kubenswrapper[4816]: 
I0316 00:10:32.037476 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.037513 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.038783 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.039077 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.049256 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.049430 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.049562 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.049681 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.049817 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.049885 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.049940 4816 reflector.go:368] Caches populated for *v1.Secret 
from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.049944 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.049892 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.050151 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.050179 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.050244 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.050310 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.050595 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.050637 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.050699 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.050721 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-tv2n7"] Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.051288 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-tv2n7" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.051358 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9r9tr\" (UniqueName: \"kubernetes.io/projected/1f26ea52-1f97-4d4a-98bd-897c5b3b88c5-kube-api-access-9r9tr\") pod \"apiserver-76f77b778f-pdm8d\" (UID: \"1f26ea52-1f97-4d4a-98bd-897c5b3b88c5\") " pod="openshift-apiserver/apiserver-76f77b778f-pdm8d" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.051295 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.051544 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1f26ea52-1f97-4d4a-98bd-897c5b3b88c5-encryption-config\") pod \"apiserver-76f77b778f-pdm8d\" (UID: \"1f26ea52-1f97-4d4a-98bd-897c5b3b88c5\") " pod="openshift-apiserver/apiserver-76f77b778f-pdm8d" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.051595 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.051709 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxgff\" (UniqueName: \"kubernetes.io/projected/c71f28a1-a68d-41c2-a9e6-4984e2e22c74-kube-api-access-vxgff\") pod \"etcd-operator-b45778765-2l7nk\" (UID: \"c71f28a1-a68d-41c2-a9e6-4984e2e22c74\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2l7nk" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.051731 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/c71f28a1-a68d-41c2-a9e6-4984e2e22c74-etcd-service-ca\") pod \"etcd-operator-b45778765-2l7nk\" (UID: \"c71f28a1-a68d-41c2-a9e6-4984e2e22c74\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2l7nk" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.051742 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.051747 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1f26ea52-1f97-4d4a-98bd-897c5b3b88c5-audit-dir\") pod \"apiserver-76f77b778f-pdm8d\" (UID: \"1f26ea52-1f97-4d4a-98bd-897c5b3b88c5\") " pod="openshift-apiserver/apiserver-76f77b778f-pdm8d" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.051765 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1f26ea52-1f97-4d4a-98bd-897c5b3b88c5-etcd-serving-ca\") pod \"apiserver-76f77b778f-pdm8d\" (UID: \"1f26ea52-1f97-4d4a-98bd-897c5b3b88c5\") " pod="openshift-apiserver/apiserver-76f77b778f-pdm8d" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.051821 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.051889 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c71f28a1-a68d-41c2-a9e6-4984e2e22c74-serving-cert\") pod \"etcd-operator-b45778765-2l7nk\" (UID: \"c71f28a1-a68d-41c2-a9e6-4984e2e22c74\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2l7nk" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.051912 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/1f26ea52-1f97-4d4a-98bd-897c5b3b88c5-node-pullsecrets\") pod \"apiserver-76f77b778f-pdm8d\" (UID: \"1f26ea52-1f97-4d4a-98bd-897c5b3b88c5\") " pod="openshift-apiserver/apiserver-76f77b778f-pdm8d" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.051929 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1f26ea52-1f97-4d4a-98bd-897c5b3b88c5-serving-cert\") pod \"apiserver-76f77b778f-pdm8d\" (UID: \"1f26ea52-1f97-4d4a-98bd-897c5b3b88c5\") " pod="openshift-apiserver/apiserver-76f77b778f-pdm8d" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.051943 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1f26ea52-1f97-4d4a-98bd-897c5b3b88c5-audit\") pod \"apiserver-76f77b778f-pdm8d\" (UID: \"1f26ea52-1f97-4d4a-98bd-897c5b3b88c5\") " pod="openshift-apiserver/apiserver-76f77b778f-pdm8d" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.051960 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1f26ea52-1f97-4d4a-98bd-897c5b3b88c5-image-import-ca\") pod \"apiserver-76f77b778f-pdm8d\" (UID: \"1f26ea52-1f97-4d4a-98bd-897c5b3b88c5\") " pod="openshift-apiserver/apiserver-76f77b778f-pdm8d" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.051968 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.051973 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1f26ea52-1f97-4d4a-98bd-897c5b3b88c5-trusted-ca-bundle\") pod \"apiserver-76f77b778f-pdm8d\" (UID: 
\"1f26ea52-1f97-4d4a-98bd-897c5b3b88c5\") " pod="openshift-apiserver/apiserver-76f77b778f-pdm8d" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.051987 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f26ea52-1f97-4d4a-98bd-897c5b3b88c5-config\") pod \"apiserver-76f77b778f-pdm8d\" (UID: \"1f26ea52-1f97-4d4a-98bd-897c5b3b88c5\") " pod="openshift-apiserver/apiserver-76f77b778f-pdm8d" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.052008 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1f26ea52-1f97-4d4a-98bd-897c5b3b88c5-etcd-client\") pod \"apiserver-76f77b778f-pdm8d\" (UID: \"1f26ea52-1f97-4d4a-98bd-897c5b3b88c5\") " pod="openshift-apiserver/apiserver-76f77b778f-pdm8d" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.052022 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/c71f28a1-a68d-41c2-a9e6-4984e2e22c74-etcd-ca\") pod \"etcd-operator-b45778765-2l7nk\" (UID: \"c71f28a1-a68d-41c2-a9e6-4984e2e22c74\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2l7nk" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.051343 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.052090 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.052037 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c71f28a1-a68d-41c2-a9e6-4984e2e22c74-etcd-client\") pod \"etcd-operator-b45778765-2l7nk\" (UID: 
\"c71f28a1-a68d-41c2-a9e6-4984e2e22c74\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2l7nk" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.051448 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.052153 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c71f28a1-a68d-41c2-a9e6-4984e2e22c74-config\") pod \"etcd-operator-b45778765-2l7nk\" (UID: \"c71f28a1-a68d-41c2-a9e6-4984e2e22c74\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2l7nk" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.051504 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.052248 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.052260 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.052322 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.052354 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.052376 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.052457 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 16 
00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.052617 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.053434 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-q9xc9"] Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.053961 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-sshl5"] Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.054382 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-sshl5" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.054860 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-pruner-29560320-s9q72"] Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.054932 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-q9xc9" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.055471 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-pruner-29560320-s9q72" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.055837 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.055993 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.056309 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.057294 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.057639 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-5rr7c"] Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.058149 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-5rr7c" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.058871 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.058976 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.059116 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.059174 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.059237 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-d4vrm"] Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.059934 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-d4vrm" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.060202 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-sh4ps"] Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.060599 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-sh4ps" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.061935 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-ckvwn"] Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.062045 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.062437 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-l648b"] Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.062964 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-l648b" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.063238 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.063541 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-mwhpz"] Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.064189 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-mwhpz" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.064274 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-gtwmf"] Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.064903 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-gtwmf" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.067149 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-tgbjg"] Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.068052 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tgbjg" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.068949 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-t8c6x"] Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.069746 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-t8c6x" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.070377 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-4t6gm"] Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.071139 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4t6gm" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.072627 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-nsxl4"] Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.073216 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-nsxl4" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.073594 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-gvk75"] Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.074074 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-gvk75" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.079112 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jm8db"] Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.080596 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29560320-4hk5d"] Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.081032 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ll5r8"] Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.081178 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jm8db" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.081484 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ll5r8" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.081516 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-fwkzt"] Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.081693 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29560320-4hk5d" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.082519 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-fwkzt" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.091894 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zn6w7"] Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.092264 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.095523 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.095824 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.096097 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.097314 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.097723 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.097924 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8tt9t"] Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.098092 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 16 00:10:32 crc 
kubenswrapper[4816]: I0316 00:10:32.098388 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.099608 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.110195 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.111061 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zn6w7" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.111362 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.111365 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.111738 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.118856 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.130502 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.131097 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8tt9t" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.131286 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.137256 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-8226q"] Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.137821 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.138222 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.138345 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.138492 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.138618 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.138726 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.138842 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.139886 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 
16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.140081 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.140328 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.142161 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.142335 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.143839 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.144074 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.144157 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.144333 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fk9l7"] Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.144943 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-vvdz2"] Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.145123 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-8226q" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.145141 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.145222 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fk9l7" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.146106 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-vvdz2" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.145285 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.148826 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.149579 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29560330-44pts"] Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.150417 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29560330-44pts" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.152745 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-mplx7"] Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.153619 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/32cac2d0-56f0-4ba8-86dc-9b57c0fcc11c-oauth-serving-cert\") pod \"console-f9d7485db-nnqsw\" (UID: \"32cac2d0-56f0-4ba8-86dc-9b57c0fcc11c\") " pod="openshift-console/console-f9d7485db-nnqsw" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.153732 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d5466ab-a589-4f7e-ae89-2f494b10f6b1-config\") pod \"route-controller-manager-6576b87f9c-d9j8j\" (UID: \"1d5466ab-a589-4f7e-ae89-2f494b10f6b1\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-d9j8j" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.153808 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/32cac2d0-56f0-4ba8-86dc-9b57c0fcc11c-console-config\") pod \"console-f9d7485db-nnqsw\" (UID: \"32cac2d0-56f0-4ba8-86dc-9b57c0fcc11c\") " pod="openshift-console/console-f9d7485db-nnqsw" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.153884 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vxgff\" (UniqueName: \"kubernetes.io/projected/c71f28a1-a68d-41c2-a9e6-4984e2e22c74-kube-api-access-vxgff\") pod \"etcd-operator-b45778765-2l7nk\" (UID: \"c71f28a1-a68d-41c2-a9e6-4984e2e22c74\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2l7nk" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 
00:10:32.153950 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-6nkm6"] Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.154021 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/32cac2d0-56f0-4ba8-86dc-9b57c0fcc11c-service-ca\") pod \"console-f9d7485db-nnqsw\" (UID: \"32cac2d0-56f0-4ba8-86dc-9b57c0fcc11c\") " pod="openshift-console/console-f9d7485db-nnqsw" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.154092 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1306b657-0022-435d-bb72-793f1c1a106b-config\") pod \"machine-api-operator-5694c8668f-9xv4p\" (UID: \"1306b657-0022-435d-bb72-793f1c1a106b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-9xv4p" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.154159 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/c71f28a1-a68d-41c2-a9e6-4984e2e22c74-etcd-service-ca\") pod \"etcd-operator-b45778765-2l7nk\" (UID: \"c71f28a1-a68d-41c2-a9e6-4984e2e22c74\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2l7nk" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.154229 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xnq8l\" (UniqueName: \"kubernetes.io/projected/32cac2d0-56f0-4ba8-86dc-9b57c0fcc11c-kube-api-access-xnq8l\") pod \"console-f9d7485db-nnqsw\" (UID: \"32cac2d0-56f0-4ba8-86dc-9b57c0fcc11c\") " pod="openshift-console/console-f9d7485db-nnqsw" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.154302 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/5e41d768-3ed4-4760-a0d5-4308d7b13379-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-s8mgz\" (UID: \"5e41d768-3ed4-4760-a0d5-4308d7b13379\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-s8mgz" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.154371 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1f26ea52-1f97-4d4a-98bd-897c5b3b88c5-audit-dir\") pod \"apiserver-76f77b778f-pdm8d\" (UID: \"1f26ea52-1f97-4d4a-98bd-897c5b3b88c5\") " pod="openshift-apiserver/apiserver-76f77b778f-pdm8d" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.154459 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1f26ea52-1f97-4d4a-98bd-897c5b3b88c5-etcd-serving-ca\") pod \"apiserver-76f77b778f-pdm8d\" (UID: \"1f26ea52-1f97-4d4a-98bd-897c5b3b88c5\") " pod="openshift-apiserver/apiserver-76f77b778f-pdm8d" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.154566 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1d5466ab-a589-4f7e-ae89-2f494b10f6b1-client-ca\") pod \"route-controller-manager-6576b87f9c-d9j8j\" (UID: \"1d5466ab-a589-4f7e-ae89-2f494b10f6b1\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-d9j8j" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.154677 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/044562bd-df74-47fa-bc8d-1c652233e9c5-serving-cert\") pod \"authentication-operator-69f744f599-6r96v\" (UID: \"044562bd-df74-47fa-bc8d-1c652233e9c5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-6r96v" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 
00:10:32.154752 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.154800 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pzj4l\" (UniqueName: \"kubernetes.io/projected/044562bd-df74-47fa-bc8d-1c652233e9c5-kube-api-access-pzj4l\") pod \"authentication-operator-69f744f599-6r96v\" (UID: \"044562bd-df74-47fa-bc8d-1c652233e9c5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-6r96v" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.154891 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/1306b657-0022-435d-bb72-793f1c1a106b-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-9xv4p\" (UID: \"1306b657-0022-435d-bb72-793f1c1a106b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-9xv4p" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.154917 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/5e41d768-3ed4-4760-a0d5-4308d7b13379-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-s8mgz\" (UID: \"5e41d768-3ed4-4760-a0d5-4308d7b13379\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-s8mgz" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.155058 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdmv5\" (UniqueName: \"kubernetes.io/projected/171b00f7-f7cf-41b3-bffd-11ceeb9f2182-kube-api-access-xdmv5\") pod \"dns-operator-744455d44c-fnmb9\" (UID: \"171b00f7-f7cf-41b3-bffd-11ceeb9f2182\") " pod="openshift-dns-operator/dns-operator-744455d44c-fnmb9" Mar 16 
00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.155100 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dced2102-9fd0-4300-9e0a-35d915f1caad-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-zrq8d\" (UID: \"dced2102-9fd0-4300-9e0a-35d915f1caad\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zrq8d" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.155119 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/dced2102-9fd0-4300-9e0a-35d915f1caad-audit-dir\") pod \"apiserver-7bbb656c7d-zrq8d\" (UID: \"dced2102-9fd0-4300-9e0a-35d915f1caad\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zrq8d" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.155137 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tcsz7\" (UniqueName: \"kubernetes.io/projected/5e41d768-3ed4-4760-a0d5-4308d7b13379-kube-api-access-tcsz7\") pod \"cluster-image-registry-operator-dc59b4c8b-s8mgz\" (UID: \"5e41d768-3ed4-4760-a0d5-4308d7b13379\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-s8mgz" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.155166 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c71f28a1-a68d-41c2-a9e6-4984e2e22c74-serving-cert\") pod \"etcd-operator-b45778765-2l7nk\" (UID: \"c71f28a1-a68d-41c2-a9e6-4984e2e22c74\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2l7nk" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.155196 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/005900fa-b395-4c1c-8e62-8e975bd0393c-serving-cert\") pod 
\"openshift-apiserver-operator-796bbdcf4f-rqxss\" (UID: \"005900fa-b395-4c1c-8e62-8e975bd0393c\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rqxss" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.155210 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/32cac2d0-56f0-4ba8-86dc-9b57c0fcc11c-console-oauth-config\") pod \"console-f9d7485db-nnqsw\" (UID: \"32cac2d0-56f0-4ba8-86dc-9b57c0fcc11c\") " pod="openshift-console/console-f9d7485db-nnqsw" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.155226 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/c1674a73-a65c-4a8d-9dc5-af576a7af7d4-machine-approver-tls\") pod \"machine-approver-56656f9798-n7fkv\" (UID: \"c1674a73-a65c-4a8d-9dc5-af576a7af7d4\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-n7fkv" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.155600 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.155965 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.156204 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/c71f28a1-a68d-41c2-a9e6-4984e2e22c74-etcd-service-ca\") pod \"etcd-operator-b45778765-2l7nk\" (UID: \"c71f28a1-a68d-41c2-a9e6-4984e2e22c74\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2l7nk" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.154484 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-6nkm6" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.156296 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1f26ea52-1f97-4d4a-98bd-897c5b3b88c5-audit-dir\") pod \"apiserver-76f77b778f-pdm8d\" (UID: \"1f26ea52-1f97-4d4a-98bd-897c5b3b88c5\") " pod="openshift-apiserver/apiserver-76f77b778f-pdm8d" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.154513 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-mplx7" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.156438 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/1f26ea52-1f97-4d4a-98bd-897c5b3b88c5-node-pullsecrets\") pod \"apiserver-76f77b778f-pdm8d\" (UID: \"1f26ea52-1f97-4d4a-98bd-897c5b3b88c5\") " pod="openshift-apiserver/apiserver-76f77b778f-pdm8d" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.159007 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/b98dcc1e-4c4b-47eb-9ddf-59a138f94247-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-v8zx4\" (UID: \"b98dcc1e-4c4b-47eb-9ddf-59a138f94247\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-v8zx4" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.159111 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1f26ea52-1f97-4d4a-98bd-897c5b3b88c5-serving-cert\") pod \"apiserver-76f77b778f-pdm8d\" (UID: \"1f26ea52-1f97-4d4a-98bd-897c5b3b88c5\") " pod="openshift-apiserver/apiserver-76f77b778f-pdm8d" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.159152 4816 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1f26ea52-1f97-4d4a-98bd-897c5b3b88c5-audit\") pod \"apiserver-76f77b778f-pdm8d\" (UID: \"1f26ea52-1f97-4d4a-98bd-897c5b3b88c5\") " pod="openshift-apiserver/apiserver-76f77b778f-pdm8d" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.159170 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/044562bd-df74-47fa-bc8d-1c652233e9c5-config\") pod \"authentication-operator-69f744f599-6r96v\" (UID: \"044562bd-df74-47fa-bc8d-1c652233e9c5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-6r96v" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.159187 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5e41d768-3ed4-4760-a0d5-4308d7b13379-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-s8mgz\" (UID: \"5e41d768-3ed4-4760-a0d5-4308d7b13379\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-s8mgz" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.159209 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/005900fa-b395-4c1c-8e62-8e975bd0393c-config\") pod \"openshift-apiserver-operator-796bbdcf4f-rqxss\" (UID: \"005900fa-b395-4c1c-8e62-8e975bd0393c\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rqxss" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.159225 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dced2102-9fd0-4300-9e0a-35d915f1caad-serving-cert\") pod \"apiserver-7bbb656c7d-zrq8d\" (UID: 
\"dced2102-9fd0-4300-9e0a-35d915f1caad\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zrq8d" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.159242 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1f26ea52-1f97-4d4a-98bd-897c5b3b88c5-image-import-ca\") pod \"apiserver-76f77b778f-pdm8d\" (UID: \"1f26ea52-1f97-4d4a-98bd-897c5b3b88c5\") " pod="openshift-apiserver/apiserver-76f77b778f-pdm8d" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.159258 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1f26ea52-1f97-4d4a-98bd-897c5b3b88c5-trusted-ca-bundle\") pod \"apiserver-76f77b778f-pdm8d\" (UID: \"1f26ea52-1f97-4d4a-98bd-897c5b3b88c5\") " pod="openshift-apiserver/apiserver-76f77b778f-pdm8d" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.159276 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bkbl4\" (UniqueName: \"kubernetes.io/projected/c1674a73-a65c-4a8d-9dc5-af576a7af7d4-kube-api-access-bkbl4\") pod \"machine-approver-56656f9798-n7fkv\" (UID: \"c1674a73-a65c-4a8d-9dc5-af576a7af7d4\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-n7fkv" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.159347 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f26ea52-1f97-4d4a-98bd-897c5b3b88c5-config\") pod \"apiserver-76f77b778f-pdm8d\" (UID: \"1f26ea52-1f97-4d4a-98bd-897c5b3b88c5\") " pod="openshift-apiserver/apiserver-76f77b778f-pdm8d" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.159368 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fht74\" (UniqueName: 
\"kubernetes.io/projected/005900fa-b395-4c1c-8e62-8e975bd0393c-kube-api-access-fht74\") pod \"openshift-apiserver-operator-796bbdcf4f-rqxss\" (UID: \"005900fa-b395-4c1c-8e62-8e975bd0393c\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rqxss" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.159395 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1f26ea52-1f97-4d4a-98bd-897c5b3b88c5-etcd-client\") pod \"apiserver-76f77b778f-pdm8d\" (UID: \"1f26ea52-1f97-4d4a-98bd-897c5b3b88c5\") " pod="openshift-apiserver/apiserver-76f77b778f-pdm8d" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.159412 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/c71f28a1-a68d-41c2-a9e6-4984e2e22c74-etcd-ca\") pod \"etcd-operator-b45778765-2l7nk\" (UID: \"c71f28a1-a68d-41c2-a9e6-4984e2e22c74\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2l7nk" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.159427 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c71f28a1-a68d-41c2-a9e6-4984e2e22c74-etcd-client\") pod \"etcd-operator-b45778765-2l7nk\" (UID: \"c71f28a1-a68d-41c2-a9e6-4984e2e22c74\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2l7nk" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.159442 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/dced2102-9fd0-4300-9e0a-35d915f1caad-audit-policies\") pod \"apiserver-7bbb656c7d-zrq8d\" (UID: \"dced2102-9fd0-4300-9e0a-35d915f1caad\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zrq8d" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.159461 4816 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvtdx\" (UniqueName: \"kubernetes.io/projected/1d5466ab-a589-4f7e-ae89-2f494b10f6b1-kube-api-access-fvtdx\") pod \"route-controller-manager-6576b87f9c-d9j8j\" (UID: \"1d5466ab-a589-4f7e-ae89-2f494b10f6b1\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-d9j8j" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.159477 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/044562bd-df74-47fa-bc8d-1c652233e9c5-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-6r96v\" (UID: \"044562bd-df74-47fa-bc8d-1c652233e9c5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-6r96v" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.159494 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c1674a73-a65c-4a8d-9dc5-af576a7af7d4-auth-proxy-config\") pod \"machine-approver-56656f9798-n7fkv\" (UID: \"c1674a73-a65c-4a8d-9dc5-af576a7af7d4\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-n7fkv" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.159507 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1674a73-a65c-4a8d-9dc5-af576a7af7d4-config\") pod \"machine-approver-56656f9798-n7fkv\" (UID: \"c1674a73-a65c-4a8d-9dc5-af576a7af7d4\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-n7fkv" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.159521 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1d5466ab-a589-4f7e-ae89-2f494b10f6b1-serving-cert\") pod 
\"route-controller-manager-6576b87f9c-d9j8j\" (UID: \"1d5466ab-a589-4f7e-ae89-2f494b10f6b1\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-d9j8j" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.159538 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/dced2102-9fd0-4300-9e0a-35d915f1caad-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-zrq8d\" (UID: \"dced2102-9fd0-4300-9e0a-35d915f1caad\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zrq8d" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.159572 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2zhnp\" (UniqueName: \"kubernetes.io/projected/dced2102-9fd0-4300-9e0a-35d915f1caad-kube-api-access-2zhnp\") pod \"apiserver-7bbb656c7d-zrq8d\" (UID: \"dced2102-9fd0-4300-9e0a-35d915f1caad\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zrq8d" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.159588 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/dced2102-9fd0-4300-9e0a-35d915f1caad-etcd-client\") pod \"apiserver-7bbb656c7d-zrq8d\" (UID: \"dced2102-9fd0-4300-9e0a-35d915f1caad\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zrq8d" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.159607 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c71f28a1-a68d-41c2-a9e6-4984e2e22c74-config\") pod \"etcd-operator-b45778765-2l7nk\" (UID: \"c71f28a1-a68d-41c2-a9e6-4984e2e22c74\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2l7nk" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.159623 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-9vhmx\" (UniqueName: \"kubernetes.io/projected/b98dcc1e-4c4b-47eb-9ddf-59a138f94247-kube-api-access-9vhmx\") pod \"cluster-samples-operator-665b6dd947-v8zx4\" (UID: \"b98dcc1e-4c4b-47eb-9ddf-59a138f94247\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-v8zx4" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.159641 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/1306b657-0022-435d-bb72-793f1c1a106b-images\") pod \"machine-api-operator-5694c8668f-9xv4p\" (UID: \"1306b657-0022-435d-bb72-793f1c1a106b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-9xv4p" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.159656 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/171b00f7-f7cf-41b3-bffd-11ceeb9f2182-metrics-tls\") pod \"dns-operator-744455d44c-fnmb9\" (UID: \"171b00f7-f7cf-41b3-bffd-11ceeb9f2182\") " pod="openshift-dns-operator/dns-operator-744455d44c-fnmb9" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.159678 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9r9tr\" (UniqueName: \"kubernetes.io/projected/1f26ea52-1f97-4d4a-98bd-897c5b3b88c5-kube-api-access-9r9tr\") pod \"apiserver-76f77b778f-pdm8d\" (UID: \"1f26ea52-1f97-4d4a-98bd-897c5b3b88c5\") " pod="openshift-apiserver/apiserver-76f77b778f-pdm8d" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.159693 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/32cac2d0-56f0-4ba8-86dc-9b57c0fcc11c-console-serving-cert\") pod \"console-f9d7485db-nnqsw\" (UID: \"32cac2d0-56f0-4ba8-86dc-9b57c0fcc11c\") " pod="openshift-console/console-f9d7485db-nnqsw" Mar 16 00:10:32 crc 
kubenswrapper[4816]: I0316 00:10:32.159708 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/dced2102-9fd0-4300-9e0a-35d915f1caad-encryption-config\") pod \"apiserver-7bbb656c7d-zrq8d\" (UID: \"dced2102-9fd0-4300-9e0a-35d915f1caad\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zrq8d" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.159723 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/044562bd-df74-47fa-bc8d-1c652233e9c5-service-ca-bundle\") pod \"authentication-operator-69f744f599-6r96v\" (UID: \"044562bd-df74-47fa-bc8d-1c652233e9c5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-6r96v" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.159743 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1f26ea52-1f97-4d4a-98bd-897c5b3b88c5-encryption-config\") pod \"apiserver-76f77b778f-pdm8d\" (UID: \"1f26ea52-1f97-4d4a-98bd-897c5b3b88c5\") " pod="openshift-apiserver/apiserver-76f77b778f-pdm8d" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.159757 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jc6t\" (UniqueName: \"kubernetes.io/projected/1306b657-0022-435d-bb72-793f1c1a106b-kube-api-access-2jc6t\") pod \"machine-api-operator-5694c8668f-9xv4p\" (UID: \"1306b657-0022-435d-bb72-793f1c1a106b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-9xv4p" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.159774 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/32cac2d0-56f0-4ba8-86dc-9b57c0fcc11c-trusted-ca-bundle\") 
pod \"console-f9d7485db-nnqsw\" (UID: \"32cac2d0-56f0-4ba8-86dc-9b57c0fcc11c\") " pod="openshift-console/console-f9d7485db-nnqsw" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.156476 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/1f26ea52-1f97-4d4a-98bd-897c5b3b88c5-node-pullsecrets\") pod \"apiserver-76f77b778f-pdm8d\" (UID: \"1f26ea52-1f97-4d4a-98bd-897c5b3b88c5\") " pod="openshift-apiserver/apiserver-76f77b778f-pdm8d" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.156597 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1f26ea52-1f97-4d4a-98bd-897c5b3b88c5-etcd-serving-ca\") pod \"apiserver-76f77b778f-pdm8d\" (UID: \"1f26ea52-1f97-4d4a-98bd-897c5b3b88c5\") " pod="openshift-apiserver/apiserver-76f77b778f-pdm8d" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.161831 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1f26ea52-1f97-4d4a-98bd-897c5b3b88c5-audit\") pod \"apiserver-76f77b778f-pdm8d\" (UID: \"1f26ea52-1f97-4d4a-98bd-897c5b3b88c5\") " pod="openshift-apiserver/apiserver-76f77b778f-pdm8d" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.162082 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c71f28a1-a68d-41c2-a9e6-4984e2e22c74-config\") pod \"etcd-operator-b45778765-2l7nk\" (UID: \"c71f28a1-a68d-41c2-a9e6-4984e2e22c74\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2l7nk" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.162294 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/c71f28a1-a68d-41c2-a9e6-4984e2e22c74-etcd-ca\") pod \"etcd-operator-b45778765-2l7nk\" (UID: \"c71f28a1-a68d-41c2-a9e6-4984e2e22c74\") " 
pod="openshift-etcd-operator/etcd-operator-b45778765-2l7nk" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.163122 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1f26ea52-1f97-4d4a-98bd-897c5b3b88c5-trusted-ca-bundle\") pod \"apiserver-76f77b778f-pdm8d\" (UID: \"1f26ea52-1f97-4d4a-98bd-897c5b3b88c5\") " pod="openshift-apiserver/apiserver-76f77b778f-pdm8d" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.164224 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f26ea52-1f97-4d4a-98bd-897c5b3b88c5-config\") pod \"apiserver-76f77b778f-pdm8d\" (UID: \"1f26ea52-1f97-4d4a-98bd-897c5b3b88c5\") " pod="openshift-apiserver/apiserver-76f77b778f-pdm8d" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.164956 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1f26ea52-1f97-4d4a-98bd-897c5b3b88c5-image-import-ca\") pod \"apiserver-76f77b778f-pdm8d\" (UID: \"1f26ea52-1f97-4d4a-98bd-897c5b3b88c5\") " pod="openshift-apiserver/apiserver-76f77b778f-pdm8d" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.166046 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.168005 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1f26ea52-1f97-4d4a-98bd-897c5b3b88c5-serving-cert\") pod \"apiserver-76f77b778f-pdm8d\" (UID: \"1f26ea52-1f97-4d4a-98bd-897c5b3b88c5\") " pod="openshift-apiserver/apiserver-76f77b778f-pdm8d" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.168352 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-2k6jt"] Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 
00:10:32.168968 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-2l7nk"] Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.168996 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-6r96v"] Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.169078 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-2k6jt" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.170079 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1f26ea52-1f97-4d4a-98bd-897c5b3b88c5-etcd-client\") pod \"apiserver-76f77b778f-pdm8d\" (UID: \"1f26ea52-1f97-4d4a-98bd-897c5b3b88c5\") " pod="openshift-apiserver/apiserver-76f77b778f-pdm8d" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.173115 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c71f28a1-a68d-41c2-a9e6-4984e2e22c74-etcd-client\") pod \"etcd-operator-b45778765-2l7nk\" (UID: \"c71f28a1-a68d-41c2-a9e6-4984e2e22c74\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2l7nk" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.176097 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"serviceca" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.179692 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.191991 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1f26ea52-1f97-4d4a-98bd-897c5b3b88c5-encryption-config\") pod \"apiserver-76f77b778f-pdm8d\" (UID: \"1f26ea52-1f97-4d4a-98bd-897c5b3b88c5\") " 
pod="openshift-apiserver/apiserver-76f77b778f-pdm8d" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.192416 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c71f28a1-a68d-41c2-a9e6-4984e2e22c74-serving-cert\") pod \"etcd-operator-b45778765-2l7nk\" (UID: \"c71f28a1-a68d-41c2-a9e6-4984e2e22c74\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2l7nk" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.202363 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.204962 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.211005 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"pruner-dockercfg-p7bcw" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.211179 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.211359 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-nnqsw"] Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.217015 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-zrq8d"] Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.217085 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-d9j8j"] Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.225190 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.229077 4816 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-q9xc9"] Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.236676 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.237206 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.237787 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-s8mgz"] Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.241003 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.244297 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rqxss"] Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.246772 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-t8c6x"] Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.248938 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-9xv4p"] Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.250609 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-fnmb9"] Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.252387 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-4t6gm"] Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.253494 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-console/downloads-7954f5f757-5rr7c"] Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.254409 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jm8db"] Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.256052 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-v8zx4"] Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.256785 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-tgbjg"] Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.257702 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.258062 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-sh4ps"] Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.260954 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-pdm8d"] Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.261585 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-sshl5"] Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.262845 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-tv2n7"] Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.263262 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2zhnp\" (UniqueName: \"kubernetes.io/projected/dced2102-9fd0-4300-9e0a-35d915f1caad-kube-api-access-2zhnp\") pod \"apiserver-7bbb656c7d-zrq8d\" (UID: \"dced2102-9fd0-4300-9e0a-35d915f1caad\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zrq8d" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.263291 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/f4bdfe91-48ef-4a44-99ba-d7ab90df9ec0-images\") pod \"machine-config-operator-74547568cd-4t6gm\" (UID: \"f4bdfe91-48ef-4a44-99ba-d7ab90df9ec0\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4t6gm" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.263310 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/dced2102-9fd0-4300-9e0a-35d915f1caad-etcd-client\") pod \"apiserver-7bbb656c7d-zrq8d\" (UID: \"dced2102-9fd0-4300-9e0a-35d915f1caad\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zrq8d" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.263326 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/7442ef1b-27ea-4166-8457-5332c4c8f363-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-mplx7\" (UID: \"7442ef1b-27ea-4166-8457-5332c4c8f363\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-mplx7" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.263342 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/9fc59286-0388-4519-afc7-f2c8cf80ab40-serviceca\") pod \"image-pruner-29560320-s9q72\" (UID: \"9fc59286-0388-4519-afc7-f2c8cf80ab40\") " pod="openshift-image-registry/image-pruner-29560320-s9q72" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.263357 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/1306b657-0022-435d-bb72-793f1c1a106b-images\") pod 
\"machine-api-operator-5694c8668f-9xv4p\" (UID: \"1306b657-0022-435d-bb72-793f1c1a106b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-9xv4p" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.263371 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/171b00f7-f7cf-41b3-bffd-11ceeb9f2182-metrics-tls\") pod \"dns-operator-744455d44c-fnmb9\" (UID: \"171b00f7-f7cf-41b3-bffd-11ceeb9f2182\") " pod="openshift-dns-operator/dns-operator-744455d44c-fnmb9" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.263386 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-spn2v\" (UniqueName: \"kubernetes.io/projected/dc6dfded-ec9e-4a6f-a97b-3bb4cf8a149a-kube-api-access-spn2v\") pod \"control-plane-machine-set-operator-78cbb6b69f-mwhpz\" (UID: \"dc6dfded-ec9e-4a6f-a97b-3bb4cf8a149a\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-mwhpz" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.263404 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/681ca8e4-f909-4e8b-9f35-5ab8ca382e44-metrics-certs\") pod \"router-default-5444994796-gvk75\" (UID: \"681ca8e4-f909-4e8b-9f35-5ab8ca382e44\") " pod="openshift-ingress/router-default-5444994796-gvk75" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.263420 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wwn7z\" (UniqueName: \"kubernetes.io/projected/79ec9746-96c0-4fcd-b367-a42b6950145a-kube-api-access-wwn7z\") pod \"service-ca-operator-777779d784-vvdz2\" (UID: \"79ec9746-96c0-4fcd-b367-a42b6950145a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vvdz2" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.263436 4816 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/044562bd-df74-47fa-bc8d-1c652233e9c5-service-ca-bundle\") pod \"authentication-operator-69f744f599-6r96v\" (UID: \"044562bd-df74-47fa-bc8d-1c652233e9c5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-6r96v" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.263454 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/32cac2d0-56f0-4ba8-86dc-9b57c0fcc11c-console-serving-cert\") pod \"console-f9d7485db-nnqsw\" (UID: \"32cac2d0-56f0-4ba8-86dc-9b57c0fcc11c\") " pod="openshift-console/console-f9d7485db-nnqsw" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.263467 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/32cac2d0-56f0-4ba8-86dc-9b57c0fcc11c-trusted-ca-bundle\") pod \"console-f9d7485db-nnqsw\" (UID: \"32cac2d0-56f0-4ba8-86dc-9b57c0fcc11c\") " pod="openshift-console/console-f9d7485db-nnqsw" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.263481 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4f90d894-17c6-4800-a438-737fe8619e01-serving-cert\") pod \"openshift-config-operator-7777fb866f-d4vrm\" (UID: \"4f90d894-17c6-4800-a438-737fe8619e01\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-d4vrm" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.263498 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pg5rd\" (UniqueName: \"kubernetes.io/projected/0a98bca9-38d3-4382-a6d6-8410170f7d81-kube-api-access-pg5rd\") pod \"openshift-controller-manager-operator-756b6f6bc6-sh4ps\" (UID: 
\"0a98bca9-38d3-4382-a6d6-8410170f7d81\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-sh4ps" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.263512 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/681ca8e4-f909-4e8b-9f35-5ab8ca382e44-default-certificate\") pod \"router-default-5444994796-gvk75\" (UID: \"681ca8e4-f909-4e8b-9f35-5ab8ca382e44\") " pod="openshift-ingress/router-default-5444994796-gvk75" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.263528 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wgzct\" (UniqueName: \"kubernetes.io/projected/9e737c04-a2db-452e-adc7-fa383e158b53-kube-api-access-wgzct\") pod \"package-server-manager-789f6589d5-jm8db\" (UID: \"9e737c04-a2db-452e-adc7-fa383e158b53\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jm8db" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.263562 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3eb76779-61d8-4977-8839-083fcf6cd69b-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-8tt9t\" (UID: \"3eb76779-61d8-4977-8839-083fcf6cd69b\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8tt9t" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.263585 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/32cac2d0-56f0-4ba8-86dc-9b57c0fcc11c-console-config\") pod \"console-f9d7485db-nnqsw\" (UID: \"32cac2d0-56f0-4ba8-86dc-9b57c0fcc11c\") " pod="openshift-console/console-f9d7485db-nnqsw" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.263602 4816 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79ec9746-96c0-4fcd-b367-a42b6950145a-config\") pod \"service-ca-operator-777779d784-vvdz2\" (UID: \"79ec9746-96c0-4fcd-b367-a42b6950145a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vvdz2" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.263621 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ef3a7303-57a8-461f-86c1-fd3f7882e93b-trusted-ca\") pod \"ingress-operator-5b745b69d9-l648b\" (UID: \"ef3a7303-57a8-461f-86c1-fd3f7882e93b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-l648b" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.263640 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j62nk\" (UniqueName: \"kubernetes.io/projected/0386f821-c5fb-4dfd-acaf-706e214a57c0-kube-api-access-j62nk\") pod \"migrator-59844c95c7-fwkzt\" (UID: \"0386f821-c5fb-4dfd-acaf-706e214a57c0\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-fwkzt" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.263660 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/dc6dfded-ec9e-4a6f-a97b-3bb4cf8a149a-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-mwhpz\" (UID: \"dc6dfded-ec9e-4a6f-a97b-3bb4cf8a149a\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-mwhpz" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.263681 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ef3a7303-57a8-461f-86c1-fd3f7882e93b-metrics-tls\") pod 
\"ingress-operator-5b745b69d9-l648b\" (UID: \"ef3a7303-57a8-461f-86c1-fd3f7882e93b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-l648b" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.263697 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rclzn\" (UniqueName: \"kubernetes.io/projected/a1bc4b9a-e741-4f63-81e8-fdce3da0a5ea-kube-api-access-rclzn\") pod \"olm-operator-6b444d44fb-nsxl4\" (UID: \"a1bc4b9a-e741-4f63-81e8-fdce3da0a5ea\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-nsxl4" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.263717 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/32cac2d0-56f0-4ba8-86dc-9b57c0fcc11c-service-ca\") pod \"console-f9d7485db-nnqsw\" (UID: \"32cac2d0-56f0-4ba8-86dc-9b57c0fcc11c\") " pod="openshift-console/console-f9d7485db-nnqsw" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.263733 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zgh99\" (UniqueName: \"kubernetes.io/projected/02854230-6165-4f22-8780-d8591b991132-kube-api-access-zgh99\") pod \"marketplace-operator-79b997595-8226q\" (UID: \"02854230-6165-4f22-8780-d8591b991132\") " pod="openshift-marketplace/marketplace-operator-79b997595-8226q" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.263750 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/34f93b2b-cc36-4965-992c-825bf2595e1e-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-tgbjg\" (UID: \"34f93b2b-cc36-4965-992c-825bf2595e1e\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tgbjg" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.263764 4816 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pkph5\" (UniqueName: \"kubernetes.io/projected/db720c64-b1fa-48c9-a4b7-fc42f8ca47fd-kube-api-access-pkph5\") pod \"service-ca-9c57cc56f-6nkm6\" (UID: \"db720c64-b1fa-48c9-a4b7-fc42f8ca47fd\") " pod="openshift-service-ca/service-ca-9c57cc56f-6nkm6" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.263780 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/064b42ee-720b-456c-8ffe-a247f827befc-apiservice-cert\") pod \"packageserver-d55dfcdfc-zn6w7\" (UID: \"064b42ee-720b-456c-8ffe-a247f827befc\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zn6w7" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.263795 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4cc341f9-55c7-4bce-a0e3-24df68ca7f0e-trusted-ca\") pod \"console-operator-58897d9998-q9xc9\" (UID: \"4cc341f9-55c7-4bce-a0e3-24df68ca7f0e\") " pod="openshift-console-operator/console-operator-58897d9998-q9xc9" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.263809 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/4f90d894-17c6-4800-a438-737fe8619e01-available-featuregates\") pod \"openshift-config-operator-7777fb866f-d4vrm\" (UID: \"4f90d894-17c6-4800-a438-737fe8619e01\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-d4vrm" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.263834 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1d5466ab-a589-4f7e-ae89-2f494b10f6b1-client-ca\") pod \"route-controller-manager-6576b87f9c-d9j8j\" (UID: 
\"1d5466ab-a589-4f7e-ae89-2f494b10f6b1\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-d9j8j" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.263850 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/044562bd-df74-47fa-bc8d-1c652233e9c5-serving-cert\") pod \"authentication-operator-69f744f599-6r96v\" (UID: \"044562bd-df74-47fa-bc8d-1c652233e9c5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-6r96v" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.263864 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59c840a8-f288-44ed-83d3-34d47041c6c6-config\") pod \"controller-manager-879f6c89f-tv2n7\" (UID: \"59c840a8-f288-44ed-83d3-34d47041c6c6\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tv2n7" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.263878 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/db720c64-b1fa-48c9-a4b7-fc42f8ca47fd-signing-cabundle\") pod \"service-ca-9c57cc56f-6nkm6\" (UID: \"db720c64-b1fa-48c9-a4b7-fc42f8ca47fd\") " pod="openshift-service-ca/service-ca-9c57cc56f-6nkm6" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.263897 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/1306b657-0022-435d-bb72-793f1c1a106b-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-9xv4p\" (UID: \"1306b657-0022-435d-bb72-793f1c1a106b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-9xv4p" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.263915 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/5e41d768-3ed4-4760-a0d5-4308d7b13379-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-s8mgz\" (UID: \"5e41d768-3ed4-4760-a0d5-4308d7b13379\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-s8mgz"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.263936 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dq52v\" (UniqueName: \"kubernetes.io/projected/55e76e8f-7d69-4f55-81f8-45c9c612876b-kube-api-access-dq52v\") pod \"auto-csr-approver-29560330-44pts\" (UID: \"55e76e8f-7d69-4f55-81f8-45c9c612876b\") " pod="openshift-infra/auto-csr-approver-29560330-44pts"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.263956 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dced2102-9fd0-4300-9e0a-35d915f1caad-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-zrq8d\" (UID: \"dced2102-9fd0-4300-9e0a-35d915f1caad\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zrq8d"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.263974 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/dced2102-9fd0-4300-9e0a-35d915f1caad-audit-dir\") pod \"apiserver-7bbb656c7d-zrq8d\" (UID: \"dced2102-9fd0-4300-9e0a-35d915f1caad\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zrq8d"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.263989 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xdmv5\" (UniqueName: \"kubernetes.io/projected/171b00f7-f7cf-41b3-bffd-11ceeb9f2182-kube-api-access-xdmv5\") pod \"dns-operator-744455d44c-fnmb9\" (UID: \"171b00f7-f7cf-41b3-bffd-11ceeb9f2182\") " pod="openshift-dns-operator/dns-operator-744455d44c-fnmb9"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.264005 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/29bcfe72-6ef1-4087-9feb-787fdba3d2d7-profile-collector-cert\") pod \"catalog-operator-68c6474976-gtwmf\" (UID: \"29bcfe72-6ef1-4087-9feb-787fdba3d2d7\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-gtwmf"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.264020 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ct69q\" (UniqueName: \"kubernetes.io/projected/7442ef1b-27ea-4166-8457-5332c4c8f363-kube-api-access-ct69q\") pod \"multus-admission-controller-857f4d67dd-mplx7\" (UID: \"7442ef1b-27ea-4166-8457-5332c4c8f363\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-mplx7"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.264035 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/005900fa-b395-4c1c-8e62-8e975bd0393c-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-rqxss\" (UID: \"005900fa-b395-4c1c-8e62-8e975bd0393c\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rqxss"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.264050 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tcsz7\" (UniqueName: \"kubernetes.io/projected/5e41d768-3ed4-4760-a0d5-4308d7b13379-kube-api-access-tcsz7\") pod \"cluster-image-registry-operator-dc59b4c8b-s8mgz\" (UID: \"5e41d768-3ed4-4760-a0d5-4308d7b13379\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-s8mgz"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.264069 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/32cac2d0-56f0-4ba8-86dc-9b57c0fcc11c-console-oauth-config\") pod \"console-f9d7485db-nnqsw\" (UID: \"32cac2d0-56f0-4ba8-86dc-9b57c0fcc11c\") " pod="openshift-console/console-f9d7485db-nnqsw"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.264083 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/681ca8e4-f909-4e8b-9f35-5ab8ca382e44-stats-auth\") pod \"router-default-5444994796-gvk75\" (UID: \"681ca8e4-f909-4e8b-9f35-5ab8ca382e44\") " pod="openshift-ingress/router-default-5444994796-gvk75"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.264108 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/b98dcc1e-4c4b-47eb-9ddf-59a138f94247-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-v8zx4\" (UID: \"b98dcc1e-4c4b-47eb-9ddf-59a138f94247\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-v8zx4"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.264125 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/069c0b04-3302-488c-84fc-eeccac5fae9b-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-fk9l7\" (UID: \"069c0b04-3302-488c-84fc-eeccac5fae9b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fk9l7"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.264143 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/79ec9746-96c0-4fcd-b367-a42b6950145a-serving-cert\") pod \"service-ca-operator-777779d784-vvdz2\" (UID: \"79ec9746-96c0-4fcd-b367-a42b6950145a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vvdz2"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.264159 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/a1bc4b9a-e741-4f63-81e8-fdce3da0a5ea-srv-cert\") pod \"olm-operator-6b444d44fb-nsxl4\" (UID: \"a1bc4b9a-e741-4f63-81e8-fdce3da0a5ea\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-nsxl4"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.264176 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bkbl4\" (UniqueName: \"kubernetes.io/projected/c1674a73-a65c-4a8d-9dc5-af576a7af7d4-kube-api-access-bkbl4\") pod \"machine-approver-56656f9798-n7fkv\" (UID: \"c1674a73-a65c-4a8d-9dc5-af576a7af7d4\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-n7fkv"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.264191 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/064b42ee-720b-456c-8ffe-a247f827befc-tmpfs\") pod \"packageserver-d55dfcdfc-zn6w7\" (UID: \"064b42ee-720b-456c-8ffe-a247f827befc\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zn6w7"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.264207 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/93dadffe-0353-4301-bb97-31b034d3dc64-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-t8c6x\" (UID: \"93dadffe-0353-4301-bb97-31b034d3dc64\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-t8c6x"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.264232 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f4bdfe91-48ef-4a44-99ba-d7ab90df9ec0-auth-proxy-config\") pod \"machine-config-operator-74547568cd-4t6gm\" (UID: \"f4bdfe91-48ef-4a44-99ba-d7ab90df9ec0\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4t6gm"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.264247 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/59c840a8-f288-44ed-83d3-34d47041c6c6-client-ca\") pod \"controller-manager-879f6c89f-tv2n7\" (UID: \"59c840a8-f288-44ed-83d3-34d47041c6c6\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tv2n7"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.264260 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3eb76779-61d8-4977-8839-083fcf6cd69b-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-8tt9t\" (UID: \"3eb76779-61d8-4977-8839-083fcf6cd69b\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8tt9t"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.266133 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/1306b657-0022-435d-bb72-793f1c1a106b-images\") pod \"machine-api-operator-5694c8668f-9xv4p\" (UID: \"1306b657-0022-435d-bb72-793f1c1a106b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-9xv4p"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.266608 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/dced2102-9fd0-4300-9e0a-35d915f1caad-audit-dir\") pod \"apiserver-7bbb656c7d-zrq8d\" (UID: \"dced2102-9fd0-4300-9e0a-35d915f1caad\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zrq8d"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.267536 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/044562bd-df74-47fa-bc8d-1c652233e9c5-service-ca-bundle\") pod \"authentication-operator-69f744f599-6r96v\" (UID: \"044562bd-df74-47fa-bc8d-1c652233e9c5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-6r96v"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.268121 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/dced2102-9fd0-4300-9e0a-35d915f1caad-etcd-client\") pod \"apiserver-7bbb656c7d-zrq8d\" (UID: \"dced2102-9fd0-4300-9e0a-35d915f1caad\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zrq8d"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.268178 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5mpc\" (UniqueName: \"kubernetes.io/projected/681ca8e4-f909-4e8b-9f35-5ab8ca382e44-kube-api-access-n5mpc\") pod \"router-default-5444994796-gvk75\" (UID: \"681ca8e4-f909-4e8b-9f35-5ab8ca382e44\") " pod="openshift-ingress/router-default-5444994796-gvk75"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.268196 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wvzp\" (UniqueName: \"kubernetes.io/projected/34f93b2b-cc36-4965-992c-825bf2595e1e-kube-api-access-4wvzp\") pod \"machine-config-controller-84d6567774-tgbjg\" (UID: \"34f93b2b-cc36-4965-992c-825bf2595e1e\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tgbjg"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.268213 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f4bdfe91-48ef-4a44-99ba-d7ab90df9ec0-proxy-tls\") pod \"machine-config-operator-74547568cd-4t6gm\" (UID: \"f4bdfe91-48ef-4a44-99ba-d7ab90df9ec0\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4t6gm"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.268230 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9397185e-a9e3-4ef4-b0be-d9dc9208adff-secret-volume\") pod \"collect-profiles-29560320-4hk5d\" (UID: \"9397185e-a9e3-4ef4-b0be-d9dc9208adff\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29560320-4hk5d"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.268255 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1d5466ab-a589-4f7e-ae89-2f494b10f6b1-serving-cert\") pod \"route-controller-manager-6576b87f9c-d9j8j\" (UID: \"1d5466ab-a589-4f7e-ae89-2f494b10f6b1\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-d9j8j"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.268271 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/044562bd-df74-47fa-bc8d-1c652233e9c5-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-6r96v\" (UID: \"044562bd-df74-47fa-bc8d-1c652233e9c5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-6r96v"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.268292 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9vkgk\" (UniqueName: \"kubernetes.io/projected/0ec3cdc0-f024-43cf-b520-7d2437e0f8df-kube-api-access-9vkgk\") pod \"downloads-7954f5f757-5rr7c\" (UID: \"0ec3cdc0-f024-43cf-b520-7d2437e0f8df\") " pod="openshift-console/downloads-7954f5f757-5rr7c"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.268307 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/069c0b04-3302-488c-84fc-eeccac5fae9b-config\") pod \"kube-controller-manager-operator-78b949d7b-fk9l7\" (UID: \"069c0b04-3302-488c-84fc-eeccac5fae9b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fk9l7"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.268327 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/069c0b04-3302-488c-84fc-eeccac5fae9b-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-fk9l7\" (UID: \"069c0b04-3302-488c-84fc-eeccac5fae9b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fk9l7"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.268355 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxxlh\" (UniqueName: \"kubernetes.io/projected/f4bdfe91-48ef-4a44-99ba-d7ab90df9ec0-kube-api-access-hxxlh\") pod \"machine-config-operator-74547568cd-4t6gm\" (UID: \"f4bdfe91-48ef-4a44-99ba-d7ab90df9ec0\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4t6gm"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.268372 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aa9721ae-1aaa-4d49-9f05-5aabb9ab31d9-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-ll5r8\" (UID: \"aa9721ae-1aaa-4d49-9f05-5aabb9ab31d9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ll5r8"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.268387 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86fb4\" (UniqueName: \"kubernetes.io/projected/4cc341f9-55c7-4bce-a0e3-24df68ca7f0e-kube-api-access-86fb4\") pod \"console-operator-58897d9998-q9xc9\" (UID: \"4cc341f9-55c7-4bce-a0e3-24df68ca7f0e\") " pod="openshift-console-operator/console-operator-58897d9998-q9xc9"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.268407 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9vhmx\" (UniqueName: \"kubernetes.io/projected/b98dcc1e-4c4b-47eb-9ddf-59a138f94247-kube-api-access-9vhmx\") pod \"cluster-samples-operator-665b6dd947-v8zx4\" (UID: \"b98dcc1e-4c4b-47eb-9ddf-59a138f94247\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-v8zx4"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.268563 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/59c840a8-f288-44ed-83d3-34d47041c6c6-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-tv2n7\" (UID: \"59c840a8-f288-44ed-83d3-34d47041c6c6\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tv2n7"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.268605 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zsbsw\" (UniqueName: \"kubernetes.io/projected/59c840a8-f288-44ed-83d3-34d47041c6c6-kube-api-access-zsbsw\") pod \"controller-manager-879f6c89f-tv2n7\" (UID: \"59c840a8-f288-44ed-83d3-34d47041c6c6\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tv2n7"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.268626 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/dced2102-9fd0-4300-9e0a-35d915f1caad-encryption-config\") pod \"apiserver-7bbb656c7d-zrq8d\" (UID: \"dced2102-9fd0-4300-9e0a-35d915f1caad\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zrq8d"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.268645 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2jc6t\" (UniqueName: \"kubernetes.io/projected/1306b657-0022-435d-bb72-793f1c1a106b-kube-api-access-2jc6t\") pod \"machine-api-operator-5694c8668f-9xv4p\" (UID: \"1306b657-0022-435d-bb72-793f1c1a106b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-9xv4p"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.268661 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/db720c64-b1fa-48c9-a4b7-fc42f8ca47fd-signing-key\") pod \"service-ca-9c57cc56f-6nkm6\" (UID: \"db720c64-b1fa-48c9-a4b7-fc42f8ca47fd\") " pod="openshift-service-ca/service-ca-9c57cc56f-6nkm6"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.268678 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxqlb\" (UniqueName: \"kubernetes.io/projected/29bcfe72-6ef1-4087-9feb-787fdba3d2d7-kube-api-access-xxqlb\") pod \"catalog-operator-68c6474976-gtwmf\" (UID: \"29bcfe72-6ef1-4087-9feb-787fdba3d2d7\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-gtwmf"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.268693 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/a1bc4b9a-e741-4f63-81e8-fdce3da0a5ea-profile-collector-cert\") pod \"olm-operator-6b444d44fb-nsxl4\" (UID: \"a1bc4b9a-e741-4f63-81e8-fdce3da0a5ea\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-nsxl4"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.268707 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9397185e-a9e3-4ef4-b0be-d9dc9208adff-config-volume\") pod \"collect-profiles-29560320-4hk5d\" (UID: \"9397185e-a9e3-4ef4-b0be-d9dc9208adff\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29560320-4hk5d"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.268727 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/32cac2d0-56f0-4ba8-86dc-9b57c0fcc11c-oauth-serving-cert\") pod \"console-f9d7485db-nnqsw\" (UID: \"32cac2d0-56f0-4ba8-86dc-9b57c0fcc11c\") " pod="openshift-console/console-f9d7485db-nnqsw"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.268744 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d5466ab-a589-4f7e-ae89-2f494b10f6b1-config\") pod \"route-controller-manager-6576b87f9c-d9j8j\" (UID: \"1d5466ab-a589-4f7e-ae89-2f494b10f6b1\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-d9j8j"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.268763 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ef3a7303-57a8-461f-86c1-fd3f7882e93b-bound-sa-token\") pod \"ingress-operator-5b745b69d9-l648b\" (UID: \"ef3a7303-57a8-461f-86c1-fd3f7882e93b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-l648b"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.268782 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0a98bca9-38d3-4382-a6d6-8410170f7d81-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-sh4ps\" (UID: \"0a98bca9-38d3-4382-a6d6-8410170f7d81\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-sh4ps"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.268798 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fskkz\" (UniqueName: \"kubernetes.io/projected/ef3a7303-57a8-461f-86c1-fd3f7882e93b-kube-api-access-fskkz\") pod \"ingress-operator-5b745b69d9-l648b\" (UID: \"ef3a7303-57a8-461f-86c1-fd3f7882e93b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-l648b"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.268817 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1306b657-0022-435d-bb72-793f1c1a106b-config\") pod \"machine-api-operator-5694c8668f-9xv4p\" (UID: \"1306b657-0022-435d-bb72-793f1c1a106b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-9xv4p"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.268845 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xnq8l\" (UniqueName: \"kubernetes.io/projected/32cac2d0-56f0-4ba8-86dc-9b57c0fcc11c-kube-api-access-xnq8l\") pod \"console-f9d7485db-nnqsw\" (UID: \"32cac2d0-56f0-4ba8-86dc-9b57c0fcc11c\") " pod="openshift-console/console-f9d7485db-nnqsw"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.268861 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5e41d768-3ed4-4760-a0d5-4308d7b13379-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-s8mgz\" (UID: \"5e41d768-3ed4-4760-a0d5-4308d7b13379\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-s8mgz"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.269428 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/005900fa-b395-4c1c-8e62-8e975bd0393c-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-rqxss\" (UID: \"005900fa-b395-4c1c-8e62-8e975bd0393c\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rqxss"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.269658 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/32cac2d0-56f0-4ba8-86dc-9b57c0fcc11c-console-config\") pod \"console-f9d7485db-nnqsw\" (UID: \"32cac2d0-56f0-4ba8-86dc-9b57c0fcc11c\") " pod="openshift-console/console-f9d7485db-nnqsw"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.269712 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-gtwmf"]
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.270104 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-ckvwn"]
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.270113 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/32cac2d0-56f0-4ba8-86dc-9b57c0fcc11c-trusted-ca-bundle\") pod \"console-f9d7485db-nnqsw\" (UID: \"32cac2d0-56f0-4ba8-86dc-9b57c0fcc11c\") " pod="openshift-console/console-f9d7485db-nnqsw"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.270210 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dced2102-9fd0-4300-9e0a-35d915f1caad-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-zrq8d\" (UID: \"dced2102-9fd0-4300-9e0a-35d915f1caad\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zrq8d"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.270378 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/044562bd-df74-47fa-bc8d-1c652233e9c5-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-6r96v\" (UID: \"044562bd-df74-47fa-bc8d-1c652233e9c5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-6r96v"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.270864 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1d5466ab-a589-4f7e-ae89-2f494b10f6b1-client-ca\") pod \"route-controller-manager-6576b87f9c-d9j8j\" (UID: \"1d5466ab-a589-4f7e-ae89-2f494b10f6b1\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-d9j8j"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.270953 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/32cac2d0-56f0-4ba8-86dc-9b57c0fcc11c-service-ca\") pod \"console-f9d7485db-nnqsw\" (UID: \"32cac2d0-56f0-4ba8-86dc-9b57c0fcc11c\") " pod="openshift-console/console-f9d7485db-nnqsw"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.271016 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/dced2102-9fd0-4300-9e0a-35d915f1caad-encryption-config\") pod \"apiserver-7bbb656c7d-zrq8d\" (UID: \"dced2102-9fd0-4300-9e0a-35d915f1caad\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zrq8d"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.271772 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/32cac2d0-56f0-4ba8-86dc-9b57c0fcc11c-oauth-serving-cert\") pod \"console-f9d7485db-nnqsw\" (UID: \"32cac2d0-56f0-4ba8-86dc-9b57c0fcc11c\") " pod="openshift-console/console-f9d7485db-nnqsw"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.272584 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/32cac2d0-56f0-4ba8-86dc-9b57c0fcc11c-console-serving-cert\") pod \"console-f9d7485db-nnqsw\" (UID: \"32cac2d0-56f0-4ba8-86dc-9b57c0fcc11c\") " pod="openshift-console/console-f9d7485db-nnqsw"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.272688 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nfjfn\" (UniqueName: \"kubernetes.io/projected/4f90d894-17c6-4800-a438-737fe8619e01-kube-api-access-nfjfn\") pod \"openshift-config-operator-7777fb866f-d4vrm\" (UID: \"4f90d894-17c6-4800-a438-737fe8619e01\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-d4vrm"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.272753 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d5466ab-a589-4f7e-ae89-2f494b10f6b1-config\") pod \"route-controller-manager-6576b87f9c-d9j8j\" (UID: \"1d5466ab-a589-4f7e-ae89-2f494b10f6b1\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-d9j8j"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.272775 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pzj4l\" (UniqueName: \"kubernetes.io/projected/044562bd-df74-47fa-bc8d-1c652233e9c5-kube-api-access-pzj4l\") pod \"authentication-operator-69f744f599-6r96v\" (UID: \"044562bd-df74-47fa-bc8d-1c652233e9c5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-6r96v"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.272808 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ll5r8"]
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.272838 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0a98bca9-38d3-4382-a6d6-8410170f7d81-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-sh4ps\" (UID: \"0a98bca9-38d3-4382-a6d6-8410170f7d81\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-sh4ps"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.273166 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/9e737c04-a2db-452e-adc7-fa383e158b53-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-jm8db\" (UID: \"9e737c04-a2db-452e-adc7-fa383e158b53\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jm8db"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.273228 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/34f93b2b-cc36-4965-992c-825bf2595e1e-proxy-tls\") pod \"machine-config-controller-84d6567774-tgbjg\" (UID: \"34f93b2b-cc36-4965-992c-825bf2595e1e\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tgbjg"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.273276 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/c1674a73-a65c-4a8d-9dc5-af576a7af7d4-machine-approver-tls\") pod \"machine-approver-56656f9798-n7fkv\" (UID: \"c1674a73-a65c-4a8d-9dc5-af576a7af7d4\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-n7fkv"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.273354 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4cc341f9-55c7-4bce-a0e3-24df68ca7f0e-serving-cert\") pod \"console-operator-58897d9998-q9xc9\" (UID: \"4cc341f9-55c7-4bce-a0e3-24df68ca7f0e\") " pod="openshift-console-operator/console-operator-58897d9998-q9xc9"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.273386 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/02854230-6165-4f22-8780-d8591b991132-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-8226q\" (UID: \"02854230-6165-4f22-8780-d8591b991132\") " pod="openshift-marketplace/marketplace-operator-79b997595-8226q"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.273407 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/064b42ee-720b-456c-8ffe-a247f827befc-webhook-cert\") pod \"packageserver-d55dfcdfc-zn6w7\" (UID: \"064b42ee-720b-456c-8ffe-a247f827befc\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zn6w7"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.273428 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdffs\" (UniqueName: \"kubernetes.io/projected/9397185e-a9e3-4ef4-b0be-d9dc9208adff-kube-api-access-pdffs\") pod \"collect-profiles-29560320-4hk5d\" (UID: \"9397185e-a9e3-4ef4-b0be-d9dc9208adff\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29560320-4hk5d"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.273445 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4cc341f9-55c7-4bce-a0e3-24df68ca7f0e-config\") pod \"console-operator-58897d9998-q9xc9\" (UID: \"4cc341f9-55c7-4bce-a0e3-24df68ca7f0e\") " pod="openshift-console-operator/console-operator-58897d9998-q9xc9"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.273466 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/02854230-6165-4f22-8780-d8591b991132-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-8226q\" (UID: \"02854230-6165-4f22-8780-d8591b991132\") " pod="openshift-marketplace/marketplace-operator-79b997595-8226q"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.273485 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5e41d768-3ed4-4760-a0d5-4308d7b13379-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-s8mgz\" (UID: \"5e41d768-3ed4-4760-a0d5-4308d7b13379\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-s8mgz"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.273527 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/93dadffe-0353-4301-bb97-31b034d3dc64-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-t8c6x\" (UID: \"93dadffe-0353-4301-bb97-31b034d3dc64\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-t8c6x"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.273592 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/044562bd-df74-47fa-bc8d-1c652233e9c5-config\") pod \"authentication-operator-69f744f599-6r96v\" (UID: \"044562bd-df74-47fa-bc8d-1c652233e9c5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-6r96v"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.273626 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/005900fa-b395-4c1c-8e62-8e975bd0393c-config\") pod \"openshift-apiserver-operator-796bbdcf4f-rqxss\" (UID: \"005900fa-b395-4c1c-8e62-8e975bd0393c\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rqxss"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.273644 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dced2102-9fd0-4300-9e0a-35d915f1caad-serving-cert\") pod \"apiserver-7bbb656c7d-zrq8d\" (UID: \"dced2102-9fd0-4300-9e0a-35d915f1caad\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zrq8d"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.273664 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fht74\" (UniqueName: \"kubernetes.io/projected/005900fa-b395-4c1c-8e62-8e975bd0393c-kube-api-access-fht74\") pod \"openshift-apiserver-operator-796bbdcf4f-rqxss\" (UID: \"005900fa-b395-4c1c-8e62-8e975bd0393c\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rqxss"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.273709 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/29bcfe72-6ef1-4087-9feb-787fdba3d2d7-srv-cert\") pod \"catalog-operator-68c6474976-gtwmf\" (UID: \"29bcfe72-6ef1-4087-9feb-787fdba3d2d7\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-gtwmf"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.273736 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/59c840a8-f288-44ed-83d3-34d47041c6c6-serving-cert\") pod \"controller-manager-879f6c89f-tv2n7\" (UID: \"59c840a8-f288-44ed-83d3-34d47041c6c6\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tv2n7"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.273765 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/dced2102-9fd0-4300-9e0a-35d915f1caad-audit-policies\") pod \"apiserver-7bbb656c7d-zrq8d\" (UID: \"dced2102-9fd0-4300-9e0a-35d915f1caad\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zrq8d"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.273782 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93dadffe-0353-4301-bb97-31b034d3dc64-config\") pod \"kube-apiserver-operator-766d6c64bb-t8c6x\" (UID: \"93dadffe-0353-4301-bb97-31b034d3dc64\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-t8c6x"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.273810 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c1674a73-a65c-4a8d-9dc5-af576a7af7d4-auth-proxy-config\") pod \"machine-approver-56656f9798-n7fkv\" (UID: \"c1674a73-a65c-4a8d-9dc5-af576a7af7d4\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-n7fkv"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.273826 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1674a73-a65c-4a8d-9dc5-af576a7af7d4-config\") pod \"machine-approver-56656f9798-n7fkv\" (UID: \"c1674a73-a65c-4a8d-9dc5-af576a7af7d4\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-n7fkv"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.273845 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fvtdx\" (UniqueName: \"kubernetes.io/projected/1d5466ab-a589-4f7e-ae89-2f494b10f6b1-kube-api-access-fvtdx\") pod \"route-controller-manager-6576b87f9c-d9j8j\" (UID: \"1d5466ab-a589-4f7e-ae89-2f494b10f6b1\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-d9j8j"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.273876 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwr4x\" (UniqueName: \"kubernetes.io/projected/064b42ee-720b-456c-8ffe-a247f827befc-kube-api-access-dwr4x\") pod \"packageserver-d55dfcdfc-zn6w7\" (UID: \"064b42ee-720b-456c-8ffe-a247f827befc\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zn6w7"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.273899 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5e41d768-3ed4-4760-a0d5-4308d7b13379-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-s8mgz\" (UID: \"5e41d768-3ed4-4760-a0d5-4308d7b13379\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-s8mgz"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.273982 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2tj47\" (UniqueName: \"kubernetes.io/projected/9fc59286-0388-4519-afc7-f2c8cf80ab40-kube-api-access-2tj47\") pod \"image-pruner-29560320-s9q72\" (UID: \"9fc59286-0388-4519-afc7-f2c8cf80ab40\") " pod="openshift-image-registry/image-pruner-29560320-s9q72"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.274000 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/681ca8e4-f909-4e8b-9f35-5ab8ca382e44-service-ca-bundle\") pod \"router-default-5444994796-gvk75\" (UID: \"681ca8e4-f909-4e8b-9f35-5ab8ca382e44\") " pod="openshift-ingress/router-default-5444994796-gvk75"
Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.274019 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bs9wq\" (UniqueName: \"kubernetes.io/projected/aa9721ae-1aaa-4d49-9f05-5aabb9ab31d9-kube-api-access-bs9wq\") pod \"kube-storage-version-migrator-operator-b67b599dd-ll5r8\" (UID: \"aa9721ae-1aaa-4d49-9f05-5aabb9ab31d9\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ll5r8" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.273626 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1306b657-0022-435d-bb72-793f1c1a106b-config\") pod \"machine-api-operator-5694c8668f-9xv4p\" (UID: \"1306b657-0022-435d-bb72-793f1c1a106b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-9xv4p" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.274125 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/044562bd-df74-47fa-bc8d-1c652233e9c5-config\") pod \"authentication-operator-69f744f599-6r96v\" (UID: \"044562bd-df74-47fa-bc8d-1c652233e9c5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-6r96v" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.274165 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/dced2102-9fd0-4300-9e0a-35d915f1caad-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-zrq8d\" (UID: \"dced2102-9fd0-4300-9e0a-35d915f1caad\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zrq8d" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.274193 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3eb76779-61d8-4977-8839-083fcf6cd69b-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-8tt9t\" (UID: \"3eb76779-61d8-4977-8839-083fcf6cd69b\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8tt9t" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.274221 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/aa9721ae-1aaa-4d49-9f05-5aabb9ab31d9-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-ll5r8\" (UID: \"aa9721ae-1aaa-4d49-9f05-5aabb9ab31d9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ll5r8" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.274506 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/005900fa-b395-4c1c-8e62-8e975bd0393c-config\") pod \"openshift-apiserver-operator-796bbdcf4f-rqxss\" (UID: \"005900fa-b395-4c1c-8e62-8e975bd0393c\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rqxss" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.274847 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/dced2102-9fd0-4300-9e0a-35d915f1caad-audit-policies\") pod \"apiserver-7bbb656c7d-zrq8d\" (UID: \"dced2102-9fd0-4300-9e0a-35d915f1caad\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zrq8d" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.274984 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1674a73-a65c-4a8d-9dc5-af576a7af7d4-config\") pod \"machine-approver-56656f9798-n7fkv\" (UID: \"c1674a73-a65c-4a8d-9dc5-af576a7af7d4\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-n7fkv" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.275150 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c1674a73-a65c-4a8d-9dc5-af576a7af7d4-auth-proxy-config\") pod \"machine-approver-56656f9798-n7fkv\" (UID: \"c1674a73-a65c-4a8d-9dc5-af576a7af7d4\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-n7fkv" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.275293 4816 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/dced2102-9fd0-4300-9e0a-35d915f1caad-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-zrq8d\" (UID: \"dced2102-9fd0-4300-9e0a-35d915f1caad\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zrq8d" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.275832 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-l648b"] Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.276055 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/5e41d768-3ed4-4760-a0d5-4308d7b13379-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-s8mgz\" (UID: \"5e41d768-3ed4-4760-a0d5-4308d7b13379\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-s8mgz" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.276816 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.277065 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1d5466ab-a589-4f7e-ae89-2f494b10f6b1-serving-cert\") pod \"route-controller-manager-6576b87f9c-d9j8j\" (UID: \"1d5466ab-a589-4f7e-ae89-2f494b10f6b1\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-d9j8j" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.277636 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-pruner-29560320-s9q72"] Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.277884 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/044562bd-df74-47fa-bc8d-1c652233e9c5-serving-cert\") pod \"authentication-operator-69f744f599-6r96v\" (UID: \"044562bd-df74-47fa-bc8d-1c652233e9c5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-6r96v" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.278016 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/171b00f7-f7cf-41b3-bffd-11ceeb9f2182-metrics-tls\") pod \"dns-operator-744455d44c-fnmb9\" (UID: \"171b00f7-f7cf-41b3-bffd-11ceeb9f2182\") " pod="openshift-dns-operator/dns-operator-744455d44c-fnmb9" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.278320 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/c1674a73-a65c-4a8d-9dc5-af576a7af7d4-machine-approver-tls\") pod \"machine-approver-56656f9798-n7fkv\" (UID: \"c1674a73-a65c-4a8d-9dc5-af576a7af7d4\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-n7fkv" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.278385 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/b98dcc1e-4c4b-47eb-9ddf-59a138f94247-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-v8zx4\" (UID: \"b98dcc1e-4c4b-47eb-9ddf-59a138f94247\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-v8zx4" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.278642 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/32cac2d0-56f0-4ba8-86dc-9b57c0fcc11c-console-oauth-config\") pod \"console-f9d7485db-nnqsw\" (UID: \"32cac2d0-56f0-4ba8-86dc-9b57c0fcc11c\") " pod="openshift-console/console-f9d7485db-nnqsw" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.280637 4816 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dced2102-9fd0-4300-9e0a-35d915f1caad-serving-cert\") pod \"apiserver-7bbb656c7d-zrq8d\" (UID: \"dced2102-9fd0-4300-9e0a-35d915f1caad\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zrq8d" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.281806 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8tt9t"] Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.282860 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29560330-44pts"] Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.283960 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-nsxl4"] Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.285029 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-mwhpz"] Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.286070 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-trp9l"] Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.289762 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-d4vrm"] Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.289796 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-mplx7"] Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.289811 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zn6w7"] Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.289903 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-trp9l" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.290269 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/1306b657-0022-435d-bb72-793f1c1a106b-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-9xv4p\" (UID: \"1306b657-0022-435d-bb72-793f1c1a106b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-9xv4p" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.290784 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-6nkm6"] Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.292484 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-8226q"] Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.293223 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-vvdz2"] Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.299609 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.302178 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29560320-4hk5d"] Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.303476 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-trp9l"] Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.305057 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-2k6jt"] Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.305589 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-fwkzt"] Mar 16 00:10:32 crc 
kubenswrapper[4816]: I0316 00:10:32.307668 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fk9l7"] Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.311534 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-npvts"] Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.313192 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-npvts"] Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.313325 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-npvts" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.316466 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.334441 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-vkr88"] Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.335066 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-vkr88" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.337320 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.357905 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.375470 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59c840a8-f288-44ed-83d3-34d47041c6c6-config\") pod \"controller-manager-879f6c89f-tv2n7\" (UID: \"59c840a8-f288-44ed-83d3-34d47041c6c6\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tv2n7" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.375669 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/db720c64-b1fa-48c9-a4b7-fc42f8ca47fd-signing-cabundle\") pod \"service-ca-9c57cc56f-6nkm6\" (UID: \"db720c64-b1fa-48c9-a4b7-fc42f8ca47fd\") " pod="openshift-service-ca/service-ca-9c57cc56f-6nkm6" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.375778 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dq52v\" (UniqueName: \"kubernetes.io/projected/55e76e8f-7d69-4f55-81f8-45c9c612876b-kube-api-access-dq52v\") pod \"auto-csr-approver-29560330-44pts\" (UID: \"55e76e8f-7d69-4f55-81f8-45c9c612876b\") " pod="openshift-infra/auto-csr-approver-29560330-44pts" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.375870 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/29bcfe72-6ef1-4087-9feb-787fdba3d2d7-profile-collector-cert\") 
pod \"catalog-operator-68c6474976-gtwmf\" (UID: \"29bcfe72-6ef1-4087-9feb-787fdba3d2d7\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-gtwmf" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.375947 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ct69q\" (UniqueName: \"kubernetes.io/projected/7442ef1b-27ea-4166-8457-5332c4c8f363-kube-api-access-ct69q\") pod \"multus-admission-controller-857f4d67dd-mplx7\" (UID: \"7442ef1b-27ea-4166-8457-5332c4c8f363\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-mplx7" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.376039 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/681ca8e4-f909-4e8b-9f35-5ab8ca382e44-stats-auth\") pod \"router-default-5444994796-gvk75\" (UID: \"681ca8e4-f909-4e8b-9f35-5ab8ca382e44\") " pod="openshift-ingress/router-default-5444994796-gvk75" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.376114 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/069c0b04-3302-488c-84fc-eeccac5fae9b-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-fk9l7\" (UID: \"069c0b04-3302-488c-84fc-eeccac5fae9b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fk9l7" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.376191 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/79ec9746-96c0-4fcd-b367-a42b6950145a-serving-cert\") pod \"service-ca-operator-777779d784-vvdz2\" (UID: \"79ec9746-96c0-4fcd-b367-a42b6950145a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vvdz2" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.376287 4816 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/064b42ee-720b-456c-8ffe-a247f827befc-tmpfs\") pod \"packageserver-d55dfcdfc-zn6w7\" (UID: \"064b42ee-720b-456c-8ffe-a247f827befc\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zn6w7" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.376405 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/93dadffe-0353-4301-bb97-31b034d3dc64-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-t8c6x\" (UID: \"93dadffe-0353-4301-bb97-31b034d3dc64\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-t8c6x" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.376504 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/a1bc4b9a-e741-4f63-81e8-fdce3da0a5ea-srv-cert\") pod \"olm-operator-6b444d44fb-nsxl4\" (UID: \"a1bc4b9a-e741-4f63-81e8-fdce3da0a5ea\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-nsxl4" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.376636 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f4bdfe91-48ef-4a44-99ba-d7ab90df9ec0-auth-proxy-config\") pod \"machine-config-operator-74547568cd-4t6gm\" (UID: \"f4bdfe91-48ef-4a44-99ba-d7ab90df9ec0\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4t6gm" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.376752 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/59c840a8-f288-44ed-83d3-34d47041c6c6-client-ca\") pod \"controller-manager-879f6c89f-tv2n7\" (UID: \"59c840a8-f288-44ed-83d3-34d47041c6c6\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-tv2n7" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.376860 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3eb76779-61d8-4977-8839-083fcf6cd69b-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-8tt9t\" (UID: \"3eb76779-61d8-4977-8839-083fcf6cd69b\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8tt9t" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.376980 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5mpc\" (UniqueName: \"kubernetes.io/projected/681ca8e4-f909-4e8b-9f35-5ab8ca382e44-kube-api-access-n5mpc\") pod \"router-default-5444994796-gvk75\" (UID: \"681ca8e4-f909-4e8b-9f35-5ab8ca382e44\") " pod="openshift-ingress/router-default-5444994796-gvk75" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.377065 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/064b42ee-720b-456c-8ffe-a247f827befc-tmpfs\") pod \"packageserver-d55dfcdfc-zn6w7\" (UID: \"064b42ee-720b-456c-8ffe-a247f827befc\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zn6w7" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.376987 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59c840a8-f288-44ed-83d3-34d47041c6c6-config\") pod \"controller-manager-879f6c89f-tv2n7\" (UID: \"59c840a8-f288-44ed-83d3-34d47041c6c6\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tv2n7" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.377018 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.377445 
4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f4bdfe91-48ef-4a44-99ba-d7ab90df9ec0-auth-proxy-config\") pod \"machine-config-operator-74547568cd-4t6gm\" (UID: \"f4bdfe91-48ef-4a44-99ba-d7ab90df9ec0\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4t6gm" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.377567 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/59c840a8-f288-44ed-83d3-34d47041c6c6-client-ca\") pod \"controller-manager-879f6c89f-tv2n7\" (UID: \"59c840a8-f288-44ed-83d3-34d47041c6c6\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tv2n7" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.377576 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wvzp\" (UniqueName: \"kubernetes.io/projected/34f93b2b-cc36-4965-992c-825bf2595e1e-kube-api-access-4wvzp\") pod \"machine-config-controller-84d6567774-tgbjg\" (UID: \"34f93b2b-cc36-4965-992c-825bf2595e1e\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tgbjg" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.377882 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9vkgk\" (UniqueName: \"kubernetes.io/projected/0ec3cdc0-f024-43cf-b520-7d2437e0f8df-kube-api-access-9vkgk\") pod \"downloads-7954f5f757-5rr7c\" (UID: \"0ec3cdc0-f024-43cf-b520-7d2437e0f8df\") " pod="openshift-console/downloads-7954f5f757-5rr7c" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.378012 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/069c0b04-3302-488c-84fc-eeccac5fae9b-config\") pod \"kube-controller-manager-operator-78b949d7b-fk9l7\" (UID: \"069c0b04-3302-488c-84fc-eeccac5fae9b\") " 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fk9l7" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.378130 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f4bdfe91-48ef-4a44-99ba-d7ab90df9ec0-proxy-tls\") pod \"machine-config-operator-74547568cd-4t6gm\" (UID: \"f4bdfe91-48ef-4a44-99ba-d7ab90df9ec0\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4t6gm" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.378254 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9397185e-a9e3-4ef4-b0be-d9dc9208adff-secret-volume\") pod \"collect-profiles-29560320-4hk5d\" (UID: \"9397185e-a9e3-4ef4-b0be-d9dc9208adff\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29560320-4hk5d" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.378363 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/069c0b04-3302-488c-84fc-eeccac5fae9b-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-fk9l7\" (UID: \"069c0b04-3302-488c-84fc-eeccac5fae9b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fk9l7" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.378469 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hxxlh\" (UniqueName: \"kubernetes.io/projected/f4bdfe91-48ef-4a44-99ba-d7ab90df9ec0-kube-api-access-hxxlh\") pod \"machine-config-operator-74547568cd-4t6gm\" (UID: \"f4bdfe91-48ef-4a44-99ba-d7ab90df9ec0\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4t6gm" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.378596 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/aa9721ae-1aaa-4d49-9f05-5aabb9ab31d9-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-ll5r8\" (UID: \"aa9721ae-1aaa-4d49-9f05-5aabb9ab31d9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ll5r8" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.378710 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-86fb4\" (UniqueName: \"kubernetes.io/projected/4cc341f9-55c7-4bce-a0e3-24df68ca7f0e-kube-api-access-86fb4\") pod \"console-operator-58897d9998-q9xc9\" (UID: \"4cc341f9-55c7-4bce-a0e3-24df68ca7f0e\") " pod="openshift-console-operator/console-operator-58897d9998-q9xc9" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.379016 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/59c840a8-f288-44ed-83d3-34d47041c6c6-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-tv2n7\" (UID: \"59c840a8-f288-44ed-83d3-34d47041c6c6\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tv2n7" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.379161 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zsbsw\" (UniqueName: \"kubernetes.io/projected/59c840a8-f288-44ed-83d3-34d47041c6c6-kube-api-access-zsbsw\") pod \"controller-manager-879f6c89f-tv2n7\" (UID: \"59c840a8-f288-44ed-83d3-34d47041c6c6\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tv2n7" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.379326 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/db720c64-b1fa-48c9-a4b7-fc42f8ca47fd-signing-key\") pod \"service-ca-9c57cc56f-6nkm6\" (UID: \"db720c64-b1fa-48c9-a4b7-fc42f8ca47fd\") " 
pod="openshift-service-ca/service-ca-9c57cc56f-6nkm6" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.379613 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xxqlb\" (UniqueName: \"kubernetes.io/projected/29bcfe72-6ef1-4087-9feb-787fdba3d2d7-kube-api-access-xxqlb\") pod \"catalog-operator-68c6474976-gtwmf\" (UID: \"29bcfe72-6ef1-4087-9feb-787fdba3d2d7\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-gtwmf" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.379784 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/a1bc4b9a-e741-4f63-81e8-fdce3da0a5ea-profile-collector-cert\") pod \"olm-operator-6b444d44fb-nsxl4\" (UID: \"a1bc4b9a-e741-4f63-81e8-fdce3da0a5ea\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-nsxl4" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.379923 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9397185e-a9e3-4ef4-b0be-d9dc9208adff-config-volume\") pod \"collect-profiles-29560320-4hk5d\" (UID: \"9397185e-a9e3-4ef4-b0be-d9dc9208adff\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29560320-4hk5d" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.380073 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ef3a7303-57a8-461f-86c1-fd3f7882e93b-bound-sa-token\") pod \"ingress-operator-5b745b69d9-l648b\" (UID: \"ef3a7303-57a8-461f-86c1-fd3f7882e93b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-l648b" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.380192 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/0a98bca9-38d3-4382-a6d6-8410170f7d81-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-sh4ps\" (UID: \"0a98bca9-38d3-4382-a6d6-8410170f7d81\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-sh4ps" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.380297 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fskkz\" (UniqueName: \"kubernetes.io/projected/ef3a7303-57a8-461f-86c1-fd3f7882e93b-kube-api-access-fskkz\") pod \"ingress-operator-5b745b69d9-l648b\" (UID: \"ef3a7303-57a8-461f-86c1-fd3f7882e93b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-l648b" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.380424 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nfjfn\" (UniqueName: \"kubernetes.io/projected/4f90d894-17c6-4800-a438-737fe8619e01-kube-api-access-nfjfn\") pod \"openshift-config-operator-7777fb866f-d4vrm\" (UID: \"4f90d894-17c6-4800-a438-737fe8619e01\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-d4vrm" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.380593 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0a98bca9-38d3-4382-a6d6-8410170f7d81-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-sh4ps\" (UID: \"0a98bca9-38d3-4382-a6d6-8410170f7d81\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-sh4ps" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.380721 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/9e737c04-a2db-452e-adc7-fa383e158b53-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-jm8db\" (UID: 
\"9e737c04-a2db-452e-adc7-fa383e158b53\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jm8db" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.380836 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/34f93b2b-cc36-4965-992c-825bf2595e1e-proxy-tls\") pod \"machine-config-controller-84d6567774-tgbjg\" (UID: \"34f93b2b-cc36-4965-992c-825bf2595e1e\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tgbjg" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.380957 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4cc341f9-55c7-4bce-a0e3-24df68ca7f0e-serving-cert\") pod \"console-operator-58897d9998-q9xc9\" (UID: \"4cc341f9-55c7-4bce-a0e3-24df68ca7f0e\") " pod="openshift-console-operator/console-operator-58897d9998-q9xc9" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.381066 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/02854230-6165-4f22-8780-d8591b991132-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-8226q\" (UID: \"02854230-6165-4f22-8780-d8591b991132\") " pod="openshift-marketplace/marketplace-operator-79b997595-8226q" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.381175 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/064b42ee-720b-456c-8ffe-a247f827befc-webhook-cert\") pod \"packageserver-d55dfcdfc-zn6w7\" (UID: \"064b42ee-720b-456c-8ffe-a247f827befc\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zn6w7" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.380865 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/0a98bca9-38d3-4382-a6d6-8410170f7d81-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-sh4ps\" (UID: \"0a98bca9-38d3-4382-a6d6-8410170f7d81\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-sh4ps" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.381287 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pdffs\" (UniqueName: \"kubernetes.io/projected/9397185e-a9e3-4ef4-b0be-d9dc9208adff-kube-api-access-pdffs\") pod \"collect-profiles-29560320-4hk5d\" (UID: \"9397185e-a9e3-4ef4-b0be-d9dc9208adff\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29560320-4hk5d" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.381357 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4cc341f9-55c7-4bce-a0e3-24df68ca7f0e-config\") pod \"console-operator-58897d9998-q9xc9\" (UID: \"4cc341f9-55c7-4bce-a0e3-24df68ca7f0e\") " pod="openshift-console-operator/console-operator-58897d9998-q9xc9" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.381386 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/02854230-6165-4f22-8780-d8591b991132-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-8226q\" (UID: \"02854230-6165-4f22-8780-d8591b991132\") " pod="openshift-marketplace/marketplace-operator-79b997595-8226q" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.381432 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/93dadffe-0353-4301-bb97-31b034d3dc64-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-t8c6x\" (UID: \"93dadffe-0353-4301-bb97-31b034d3dc64\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-t8c6x" Mar 16 
00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.381482 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/29bcfe72-6ef1-4087-9feb-787fdba3d2d7-srv-cert\") pod \"catalog-operator-68c6474976-gtwmf\" (UID: \"29bcfe72-6ef1-4087-9feb-787fdba3d2d7\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-gtwmf" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.381500 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/59c840a8-f288-44ed-83d3-34d47041c6c6-serving-cert\") pod \"controller-manager-879f6c89f-tv2n7\" (UID: \"59c840a8-f288-44ed-83d3-34d47041c6c6\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tv2n7" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.381580 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dwr4x\" (UniqueName: \"kubernetes.io/projected/064b42ee-720b-456c-8ffe-a247f827befc-kube-api-access-dwr4x\") pod \"packageserver-d55dfcdfc-zn6w7\" (UID: \"064b42ee-720b-456c-8ffe-a247f827befc\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zn6w7" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.381605 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2tj47\" (UniqueName: \"kubernetes.io/projected/9fc59286-0388-4519-afc7-f2c8cf80ab40-kube-api-access-2tj47\") pod \"image-pruner-29560320-s9q72\" (UID: \"9fc59286-0388-4519-afc7-f2c8cf80ab40\") " pod="openshift-image-registry/image-pruner-29560320-s9q72" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.381623 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/681ca8e4-f909-4e8b-9f35-5ab8ca382e44-service-ca-bundle\") pod \"router-default-5444994796-gvk75\" (UID: 
\"681ca8e4-f909-4e8b-9f35-5ab8ca382e44\") " pod="openshift-ingress/router-default-5444994796-gvk75" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.381641 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93dadffe-0353-4301-bb97-31b034d3dc64-config\") pod \"kube-apiserver-operator-766d6c64bb-t8c6x\" (UID: \"93dadffe-0353-4301-bb97-31b034d3dc64\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-t8c6x" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.381671 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3eb76779-61d8-4977-8839-083fcf6cd69b-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-8tt9t\" (UID: \"3eb76779-61d8-4977-8839-083fcf6cd69b\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8tt9t" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.381696 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa9721ae-1aaa-4d49-9f05-5aabb9ab31d9-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-ll5r8\" (UID: \"aa9721ae-1aaa-4d49-9f05-5aabb9ab31d9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ll5r8" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.381724 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bs9wq\" (UniqueName: \"kubernetes.io/projected/aa9721ae-1aaa-4d49-9f05-5aabb9ab31d9-kube-api-access-bs9wq\") pod \"kube-storage-version-migrator-operator-b67b599dd-ll5r8\" (UID: \"aa9721ae-1aaa-4d49-9f05-5aabb9ab31d9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ll5r8" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 
00:10:32.381757 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/f4bdfe91-48ef-4a44-99ba-d7ab90df9ec0-images\") pod \"machine-config-operator-74547568cd-4t6gm\" (UID: \"f4bdfe91-48ef-4a44-99ba-d7ab90df9ec0\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4t6gm" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.381793 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/7442ef1b-27ea-4166-8457-5332c4c8f363-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-mplx7\" (UID: \"7442ef1b-27ea-4166-8457-5332c4c8f363\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-mplx7" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.381811 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/9fc59286-0388-4519-afc7-f2c8cf80ab40-serviceca\") pod \"image-pruner-29560320-s9q72\" (UID: \"9fc59286-0388-4519-afc7-f2c8cf80ab40\") " pod="openshift-image-registry/image-pruner-29560320-s9q72" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.381837 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-spn2v\" (UniqueName: \"kubernetes.io/projected/dc6dfded-ec9e-4a6f-a97b-3bb4cf8a149a-kube-api-access-spn2v\") pod \"control-plane-machine-set-operator-78cbb6b69f-mwhpz\" (UID: \"dc6dfded-ec9e-4a6f-a97b-3bb4cf8a149a\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-mwhpz" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.381856 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/681ca8e4-f909-4e8b-9f35-5ab8ca382e44-metrics-certs\") pod \"router-default-5444994796-gvk75\" (UID: \"681ca8e4-f909-4e8b-9f35-5ab8ca382e44\") " 
pod="openshift-ingress/router-default-5444994796-gvk75" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.381875 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wwn7z\" (UniqueName: \"kubernetes.io/projected/79ec9746-96c0-4fcd-b367-a42b6950145a-kube-api-access-wwn7z\") pod \"service-ca-operator-777779d784-vvdz2\" (UID: \"79ec9746-96c0-4fcd-b367-a42b6950145a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vvdz2" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.381913 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4f90d894-17c6-4800-a438-737fe8619e01-serving-cert\") pod \"openshift-config-operator-7777fb866f-d4vrm\" (UID: \"4f90d894-17c6-4800-a438-737fe8619e01\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-d4vrm" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.381939 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pg5rd\" (UniqueName: \"kubernetes.io/projected/0a98bca9-38d3-4382-a6d6-8410170f7d81-kube-api-access-pg5rd\") pod \"openshift-controller-manager-operator-756b6f6bc6-sh4ps\" (UID: \"0a98bca9-38d3-4382-a6d6-8410170f7d81\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-sh4ps" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.381959 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/681ca8e4-f909-4e8b-9f35-5ab8ca382e44-default-certificate\") pod \"router-default-5444994796-gvk75\" (UID: \"681ca8e4-f909-4e8b-9f35-5ab8ca382e44\") " pod="openshift-ingress/router-default-5444994796-gvk75" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.381984 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wgzct\" (UniqueName: 
\"kubernetes.io/projected/9e737c04-a2db-452e-adc7-fa383e158b53-kube-api-access-wgzct\") pod \"package-server-manager-789f6589d5-jm8db\" (UID: \"9e737c04-a2db-452e-adc7-fa383e158b53\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jm8db" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.382004 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3eb76779-61d8-4977-8839-083fcf6cd69b-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-8tt9t\" (UID: \"3eb76779-61d8-4977-8839-083fcf6cd69b\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8tt9t" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.382026 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79ec9746-96c0-4fcd-b367-a42b6950145a-config\") pod \"service-ca-operator-777779d784-vvdz2\" (UID: \"79ec9746-96c0-4fcd-b367-a42b6950145a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vvdz2" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.382046 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ef3a7303-57a8-461f-86c1-fd3f7882e93b-trusted-ca\") pod \"ingress-operator-5b745b69d9-l648b\" (UID: \"ef3a7303-57a8-461f-86c1-fd3f7882e93b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-l648b" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.382063 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j62nk\" (UniqueName: \"kubernetes.io/projected/0386f821-c5fb-4dfd-acaf-706e214a57c0-kube-api-access-j62nk\") pod \"migrator-59844c95c7-fwkzt\" (UID: \"0386f821-c5fb-4dfd-acaf-706e214a57c0\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-fwkzt" Mar 16 00:10:32 crc 
kubenswrapper[4816]: I0316 00:10:32.382207 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/59c840a8-f288-44ed-83d3-34d47041c6c6-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-tv2n7\" (UID: \"59c840a8-f288-44ed-83d3-34d47041c6c6\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tv2n7" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.382346 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4cc341f9-55c7-4bce-a0e3-24df68ca7f0e-config\") pod \"console-operator-58897d9998-q9xc9\" (UID: \"4cc341f9-55c7-4bce-a0e3-24df68ca7f0e\") " pod="openshift-console-operator/console-operator-58897d9998-q9xc9" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.382903 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/dc6dfded-ec9e-4a6f-a97b-3bb4cf8a149a-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-mwhpz\" (UID: \"dc6dfded-ec9e-4a6f-a97b-3bb4cf8a149a\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-mwhpz" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.382936 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ef3a7303-57a8-461f-86c1-fd3f7882e93b-metrics-tls\") pod \"ingress-operator-5b745b69d9-l648b\" (UID: \"ef3a7303-57a8-461f-86c1-fd3f7882e93b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-l648b" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.382958 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rclzn\" (UniqueName: \"kubernetes.io/projected/a1bc4b9a-e741-4f63-81e8-fdce3da0a5ea-kube-api-access-rclzn\") pod \"olm-operator-6b444d44fb-nsxl4\" 
(UID: \"a1bc4b9a-e741-4f63-81e8-fdce3da0a5ea\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-nsxl4" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.383040 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zgh99\" (UniqueName: \"kubernetes.io/projected/02854230-6165-4f22-8780-d8591b991132-kube-api-access-zgh99\") pod \"marketplace-operator-79b997595-8226q\" (UID: \"02854230-6165-4f22-8780-d8591b991132\") " pod="openshift-marketplace/marketplace-operator-79b997595-8226q" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.383119 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/34f93b2b-cc36-4965-992c-825bf2595e1e-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-tgbjg\" (UID: \"34f93b2b-cc36-4965-992c-825bf2595e1e\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tgbjg" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.383189 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pkph5\" (UniqueName: \"kubernetes.io/projected/db720c64-b1fa-48c9-a4b7-fc42f8ca47fd-kube-api-access-pkph5\") pod \"service-ca-9c57cc56f-6nkm6\" (UID: \"db720c64-b1fa-48c9-a4b7-fc42f8ca47fd\") " pod="openshift-service-ca/service-ca-9c57cc56f-6nkm6" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.383209 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/064b42ee-720b-456c-8ffe-a247f827befc-apiservice-cert\") pod \"packageserver-d55dfcdfc-zn6w7\" (UID: \"064b42ee-720b-456c-8ffe-a247f827befc\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zn6w7" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.383251 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" 
(UniqueName: \"kubernetes.io/configmap/4cc341f9-55c7-4bce-a0e3-24df68ca7f0e-trusted-ca\") pod \"console-operator-58897d9998-q9xc9\" (UID: \"4cc341f9-55c7-4bce-a0e3-24df68ca7f0e\") " pod="openshift-console-operator/console-operator-58897d9998-q9xc9" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.383297 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/4f90d894-17c6-4800-a438-737fe8619e01-available-featuregates\") pod \"openshift-config-operator-7777fb866f-d4vrm\" (UID: \"4f90d894-17c6-4800-a438-737fe8619e01\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-d4vrm" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.383800 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0a98bca9-38d3-4382-a6d6-8410170f7d81-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-sh4ps\" (UID: \"0a98bca9-38d3-4382-a6d6-8410170f7d81\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-sh4ps" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.383835 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/4f90d894-17c6-4800-a438-737fe8619e01-available-featuregates\") pod \"openshift-config-operator-7777fb866f-d4vrm\" (UID: \"4f90d894-17c6-4800-a438-737fe8619e01\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-d4vrm" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.384846 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4cc341f9-55c7-4bce-a0e3-24df68ca7f0e-serving-cert\") pod \"console-operator-58897d9998-q9xc9\" (UID: \"4cc341f9-55c7-4bce-a0e3-24df68ca7f0e\") " pod="openshift-console-operator/console-operator-58897d9998-q9xc9" Mar 
16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.385513 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/9fc59286-0388-4519-afc7-f2c8cf80ab40-serviceca\") pod \"image-pruner-29560320-s9q72\" (UID: \"9fc59286-0388-4519-afc7-f2c8cf80ab40\") " pod="openshift-image-registry/image-pruner-29560320-s9q72" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.385583 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/34f93b2b-cc36-4965-992c-825bf2595e1e-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-tgbjg\" (UID: \"34f93b2b-cc36-4965-992c-825bf2595e1e\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tgbjg" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.385729 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4f90d894-17c6-4800-a438-737fe8619e01-serving-cert\") pod \"openshift-config-operator-7777fb866f-d4vrm\" (UID: \"4f90d894-17c6-4800-a438-737fe8619e01\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-d4vrm" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.389558 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4cc341f9-55c7-4bce-a0e3-24df68ca7f0e-trusted-ca\") pod \"console-operator-58897d9998-q9xc9\" (UID: \"4cc341f9-55c7-4bce-a0e3-24df68ca7f0e\") " pod="openshift-console-operator/console-operator-58897d9998-q9xc9" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.390610 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/59c840a8-f288-44ed-83d3-34d47041c6c6-serving-cert\") pod \"controller-manager-879f6c89f-tv2n7\" (UID: \"59c840a8-f288-44ed-83d3-34d47041c6c6\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-tv2n7" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.397406 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.421849 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.424596 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ef3a7303-57a8-461f-86c1-fd3f7882e93b-trusted-ca\") pod \"ingress-operator-5b745b69d9-l648b\" (UID: \"ef3a7303-57a8-461f-86c1-fd3f7882e93b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-l648b" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.437490 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.457111 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.477128 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.488664 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ef3a7303-57a8-461f-86c1-fd3f7882e93b-metrics-tls\") pod \"ingress-operator-5b745b69d9-l648b\" (UID: \"ef3a7303-57a8-461f-86c1-fd3f7882e93b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-l648b" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.497285 4816 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.517383 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.537375 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.560010 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.578504 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.588237 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/dc6dfded-ec9e-4a6f-a97b-3bb4cf8a149a-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-mwhpz\" (UID: \"dc6dfded-ec9e-4a6f-a97b-3bb4cf8a149a\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-mwhpz" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.597343 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.618007 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.637826 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.641795 4816 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9397185e-a9e3-4ef4-b0be-d9dc9208adff-secret-volume\") pod \"collect-profiles-29560320-4hk5d\" (UID: \"9397185e-a9e3-4ef4-b0be-d9dc9208adff\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29560320-4hk5d" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.643257 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/a1bc4b9a-e741-4f63-81e8-fdce3da0a5ea-profile-collector-cert\") pod \"olm-operator-6b444d44fb-nsxl4\" (UID: \"a1bc4b9a-e741-4f63-81e8-fdce3da0a5ea\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-nsxl4" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.648504 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/29bcfe72-6ef1-4087-9feb-787fdba3d2d7-profile-collector-cert\") pod \"catalog-operator-68c6474976-gtwmf\" (UID: \"29bcfe72-6ef1-4087-9feb-787fdba3d2d7\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-gtwmf" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.656379 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.664003 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/29bcfe72-6ef1-4087-9feb-787fdba3d2d7-srv-cert\") pod \"catalog-operator-68c6474976-gtwmf\" (UID: \"29bcfe72-6ef1-4087-9feb-787fdba3d2d7\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-gtwmf" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.677126 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 16 00:10:32 crc 
kubenswrapper[4816]: I0316 00:10:32.698586 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.717947 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.724694 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/34f93b2b-cc36-4965-992c-825bf2595e1e-proxy-tls\") pod \"machine-config-controller-84d6567774-tgbjg\" (UID: \"34f93b2b-cc36-4965-992c-825bf2595e1e\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tgbjg" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.737654 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.757441 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.777190 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.796927 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.804485 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/93dadffe-0353-4301-bb97-31b034d3dc64-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-t8c6x\" (UID: \"93dadffe-0353-4301-bb97-31b034d3dc64\") " 
pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-t8c6x" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.817842 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.825401 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93dadffe-0353-4301-bb97-31b034d3dc64-config\") pod \"kube-apiserver-operator-766d6c64bb-t8c6x\" (UID: \"93dadffe-0353-4301-bb97-31b034d3dc64\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-t8c6x" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.836748 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.843695 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/f4bdfe91-48ef-4a44-99ba-d7ab90df9ec0-images\") pod \"machine-config-operator-74547568cd-4t6gm\" (UID: \"f4bdfe91-48ef-4a44-99ba-d7ab90df9ec0\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4t6gm" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.856526 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.877540 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.880938 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f4bdfe91-48ef-4a44-99ba-d7ab90df9ec0-proxy-tls\") pod \"machine-config-operator-74547568cd-4t6gm\" (UID: 
\"f4bdfe91-48ef-4a44-99ba-d7ab90df9ec0\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4t6gm" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.897378 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.910254 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/a1bc4b9a-e741-4f63-81e8-fdce3da0a5ea-srv-cert\") pod \"olm-operator-6b444d44fb-nsxl4\" (UID: \"a1bc4b9a-e741-4f63-81e8-fdce3da0a5ea\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-nsxl4" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.916945 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.936642 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.956431 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.967580 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/681ca8e4-f909-4e8b-9f35-5ab8ca382e44-default-certificate\") pod \"router-default-5444994796-gvk75\" (UID: \"681ca8e4-f909-4e8b-9f35-5ab8ca382e44\") " pod="openshift-ingress/router-default-5444994796-gvk75" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.977019 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.989079 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: 
\"kubernetes.io/secret/681ca8e4-f909-4e8b-9f35-5ab8ca382e44-stats-auth\") pod \"router-default-5444994796-gvk75\" (UID: \"681ca8e4-f909-4e8b-9f35-5ab8ca382e44\") " pod="openshift-ingress/router-default-5444994796-gvk75" Mar 16 00:10:32 crc kubenswrapper[4816]: I0316 00:10:32.996412 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 16 00:10:33 crc kubenswrapper[4816]: I0316 00:10:33.007031 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/681ca8e4-f909-4e8b-9f35-5ab8ca382e44-metrics-certs\") pod \"router-default-5444994796-gvk75\" (UID: \"681ca8e4-f909-4e8b-9f35-5ab8ca382e44\") " pod="openshift-ingress/router-default-5444994796-gvk75" Mar 16 00:10:33 crc kubenswrapper[4816]: I0316 00:10:33.017681 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 16 00:10:33 crc kubenswrapper[4816]: I0316 00:10:33.022705 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/681ca8e4-f909-4e8b-9f35-5ab8ca382e44-service-ca-bundle\") pod \"router-default-5444994796-gvk75\" (UID: \"681ca8e4-f909-4e8b-9f35-5ab8ca382e44\") " pod="openshift-ingress/router-default-5444994796-gvk75" Mar 16 00:10:33 crc kubenswrapper[4816]: I0316 00:10:33.037083 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 16 00:10:33 crc kubenswrapper[4816]: I0316 00:10:33.056255 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 16 00:10:33 crc kubenswrapper[4816]: I0316 00:10:33.064489 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/9e737c04-a2db-452e-adc7-fa383e158b53-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-jm8db\" (UID: \"9e737c04-a2db-452e-adc7-fa383e158b53\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jm8db" Mar 16 00:10:33 crc kubenswrapper[4816]: I0316 00:10:33.076862 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 16 00:10:33 crc kubenswrapper[4816]: I0316 00:10:33.082213 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aa9721ae-1aaa-4d49-9f05-5aabb9ab31d9-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-ll5r8\" (UID: \"aa9721ae-1aaa-4d49-9f05-5aabb9ab31d9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ll5r8" Mar 16 00:10:33 crc kubenswrapper[4816]: I0316 00:10:33.095622 4816 request.go:700] Waited for 1.013848791s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-storage-version-migrator-operator/configmaps?fieldSelector=metadata.name%3Dopenshift-service-ca.crt&limit=500&resourceVersion=0 Mar 16 00:10:33 crc kubenswrapper[4816]: I0316 00:10:33.097208 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 16 00:10:33 crc kubenswrapper[4816]: I0316 00:10:33.116933 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 16 00:10:33 crc kubenswrapper[4816]: I0316 00:10:33.122925 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa9721ae-1aaa-4d49-9f05-5aabb9ab31d9-config\") pod 
\"kube-storage-version-migrator-operator-b67b599dd-ll5r8\" (UID: \"aa9721ae-1aaa-4d49-9f05-5aabb9ab31d9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ll5r8" Mar 16 00:10:33 crc kubenswrapper[4816]: I0316 00:10:33.136886 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 16 00:10:33 crc kubenswrapper[4816]: I0316 00:10:33.156393 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 16 00:10:33 crc kubenswrapper[4816]: I0316 00:10:33.177811 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 16 00:10:33 crc kubenswrapper[4816]: I0316 00:10:33.181959 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9397185e-a9e3-4ef4-b0be-d9dc9208adff-config-volume\") pod \"collect-profiles-29560320-4hk5d\" (UID: \"9397185e-a9e3-4ef4-b0be-d9dc9208adff\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29560320-4hk5d" Mar 16 00:10:33 crc kubenswrapper[4816]: I0316 00:10:33.197287 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 16 00:10:33 crc kubenswrapper[4816]: I0316 00:10:33.217698 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 16 00:10:33 crc kubenswrapper[4816]: I0316 00:10:33.236922 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 16 00:10:33 crc kubenswrapper[4816]: I0316 00:10:33.257133 4816 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 16 00:10:33 crc kubenswrapper[4816]: I0316 00:10:33.297766 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 16 00:10:33 crc kubenswrapper[4816]: I0316 00:10:33.304739 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/064b42ee-720b-456c-8ffe-a247f827befc-webhook-cert\") pod \"packageserver-d55dfcdfc-zn6w7\" (UID: \"064b42ee-720b-456c-8ffe-a247f827befc\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zn6w7" Mar 16 00:10:33 crc kubenswrapper[4816]: I0316 00:10:33.307342 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/064b42ee-720b-456c-8ffe-a247f827befc-apiservice-cert\") pod \"packageserver-d55dfcdfc-zn6w7\" (UID: \"064b42ee-720b-456c-8ffe-a247f827befc\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zn6w7" Mar 16 00:10:33 crc kubenswrapper[4816]: I0316 00:10:33.319688 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 16 00:10:33 crc kubenswrapper[4816]: I0316 00:10:33.337262 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 16 00:10:33 crc kubenswrapper[4816]: I0316 00:10:33.347903 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3eb76779-61d8-4977-8839-083fcf6cd69b-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-8tt9t\" (UID: \"3eb76779-61d8-4977-8839-083fcf6cd69b\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8tt9t" Mar 16 00:10:33 crc 
kubenswrapper[4816]: I0316 00:10:33.357795 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 16 00:10:33 crc kubenswrapper[4816]: E0316 00:10:33.376003 4816 configmap.go:193] Couldn't get configMap openshift-service-ca/signing-cabundle: failed to sync configmap cache: timed out waiting for the condition Mar 16 00:10:33 crc kubenswrapper[4816]: E0316 00:10:33.376113 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/db720c64-b1fa-48c9-a4b7-fc42f8ca47fd-signing-cabundle podName:db720c64-b1fa-48c9-a4b7-fc42f8ca47fd nodeName:}" failed. No retries permitted until 2026-03-16 00:10:33.876083207 +0000 UTC m=+226.972383200 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "signing-cabundle" (UniqueName: "kubernetes.io/configmap/db720c64-b1fa-48c9-a4b7-fc42f8ca47fd-signing-cabundle") pod "service-ca-9c57cc56f-6nkm6" (UID: "db720c64-b1fa-48c9-a4b7-fc42f8ca47fd") : failed to sync configmap cache: timed out waiting for the condition Mar 16 00:10:33 crc kubenswrapper[4816]: E0316 00:10:33.376268 4816 secret.go:188] Couldn't get secret openshift-kube-controller-manager-operator/kube-controller-manager-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition Mar 16 00:10:33 crc kubenswrapper[4816]: E0316 00:10:33.376290 4816 secret.go:188] Couldn't get secret openshift-service-ca-operator/serving-cert: failed to sync secret cache: timed out waiting for the condition Mar 16 00:10:33 crc kubenswrapper[4816]: E0316 00:10:33.376350 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/79ec9746-96c0-4fcd-b367-a42b6950145a-serving-cert podName:79ec9746-96c0-4fcd-b367-a42b6950145a nodeName:}" failed. No retries permitted until 2026-03-16 00:10:33.876324704 +0000 UTC m=+226.972624667 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/79ec9746-96c0-4fcd-b367-a42b6950145a-serving-cert") pod "service-ca-operator-777779d784-vvdz2" (UID: "79ec9746-96c0-4fcd-b367-a42b6950145a") : failed to sync secret cache: timed out waiting for the condition Mar 16 00:10:33 crc kubenswrapper[4816]: E0316 00:10:33.376369 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/069c0b04-3302-488c-84fc-eeccac5fae9b-serving-cert podName:069c0b04-3302-488c-84fc-eeccac5fae9b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:33.876360325 +0000 UTC m=+226.972660278 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/069c0b04-3302-488c-84fc-eeccac5fae9b-serving-cert") pod "kube-controller-manager-operator-78b949d7b-fk9l7" (UID: "069c0b04-3302-488c-84fc-eeccac5fae9b") : failed to sync secret cache: timed out waiting for the condition Mar 16 00:10:33 crc kubenswrapper[4816]: E0316 00:10:33.378728 4816 configmap.go:193] Couldn't get configMap openshift-kube-controller-manager-operator/kube-controller-manager-operator-config: failed to sync configmap cache: timed out waiting for the condition Mar 16 00:10:33 crc kubenswrapper[4816]: E0316 00:10:33.378796 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/069c0b04-3302-488c-84fc-eeccac5fae9b-config podName:069c0b04-3302-488c-84fc-eeccac5fae9b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:33.878777601 +0000 UTC m=+226.975077554 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/069c0b04-3302-488c-84fc-eeccac5fae9b-config") pod "kube-controller-manager-operator-78b949d7b-fk9l7" (UID: "069c0b04-3302-488c-84fc-eeccac5fae9b") : failed to sync configmap cache: timed out waiting for the condition Mar 16 00:10:33 crc kubenswrapper[4816]: I0316 00:10:33.379779 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 16 00:10:33 crc kubenswrapper[4816]: E0316 00:10:33.379811 4816 secret.go:188] Couldn't get secret openshift-service-ca/signing-key: failed to sync secret cache: timed out waiting for the condition Mar 16 00:10:33 crc kubenswrapper[4816]: E0316 00:10:33.379914 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/db720c64-b1fa-48c9-a4b7-fc42f8ca47fd-signing-key podName:db720c64-b1fa-48c9-a4b7-fc42f8ca47fd nodeName:}" failed. No retries permitted until 2026-03-16 00:10:33.879887721 +0000 UTC m=+226.976187704 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "signing-key" (UniqueName: "kubernetes.io/secret/db720c64-b1fa-48c9-a4b7-fc42f8ca47fd-signing-key") pod "service-ca-9c57cc56f-6nkm6" (UID: "db720c64-b1fa-48c9-a4b7-fc42f8ca47fd") : failed to sync secret cache: timed out waiting for the condition Mar 16 00:10:33 crc kubenswrapper[4816]: E0316 00:10:33.382196 4816 configmap.go:193] Couldn't get configMap openshift-marketplace/marketplace-trusted-ca: failed to sync configmap cache: timed out waiting for the condition Mar 16 00:10:33 crc kubenswrapper[4816]: E0316 00:10:33.382256 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/02854230-6165-4f22-8780-d8591b991132-marketplace-trusted-ca podName:02854230-6165-4f22-8780-d8591b991132 nodeName:}" failed. No retries permitted until 2026-03-16 00:10:33.882242785 +0000 UTC m=+226.978542748 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "marketplace-trusted-ca" (UniqueName: "kubernetes.io/configmap/02854230-6165-4f22-8780-d8591b991132-marketplace-trusted-ca") pod "marketplace-operator-79b997595-8226q" (UID: "02854230-6165-4f22-8780-d8591b991132") : failed to sync configmap cache: timed out waiting for the condition Mar 16 00:10:33 crc kubenswrapper[4816]: E0316 00:10:33.382731 4816 secret.go:188] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: failed to sync secret cache: timed out waiting for the condition Mar 16 00:10:33 crc kubenswrapper[4816]: E0316 00:10:33.382807 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/02854230-6165-4f22-8780-d8591b991132-marketplace-operator-metrics podName:02854230-6165-4f22-8780-d8591b991132 nodeName:}" failed. No retries permitted until 2026-03-16 00:10:33.8827914 +0000 UTC m=+226.979091413 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/02854230-6165-4f22-8780-d8591b991132-marketplace-operator-metrics") pod "marketplace-operator-79b997595-8226q" (UID: "02854230-6165-4f22-8780-d8591b991132") : failed to sync secret cache: timed out waiting for the condition Mar 16 00:10:33 crc kubenswrapper[4816]: E0316 00:10:33.384096 4816 secret.go:188] Couldn't get secret openshift-multus/multus-admission-controller-secret: failed to sync secret cache: timed out waiting for the condition Mar 16 00:10:33 crc kubenswrapper[4816]: E0316 00:10:33.384113 4816 configmap.go:193] Couldn't get configMap openshift-service-ca-operator/service-ca-operator-config: failed to sync configmap cache: timed out waiting for the condition Mar 16 00:10:33 crc kubenswrapper[4816]: E0316 00:10:33.384171 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7442ef1b-27ea-4166-8457-5332c4c8f363-webhook-certs podName:7442ef1b-27ea-4166-8457-5332c4c8f363 nodeName:}" failed. 
No retries permitted until 2026-03-16 00:10:33.884154078 +0000 UTC m=+226.980454071 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/7442ef1b-27ea-4166-8457-5332c4c8f363-webhook-certs") pod "multus-admission-controller-857f4d67dd-mplx7" (UID: "7442ef1b-27ea-4166-8457-5332c4c8f363") : failed to sync secret cache: timed out waiting for the condition Mar 16 00:10:33 crc kubenswrapper[4816]: E0316 00:10:33.384211 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/79ec9746-96c0-4fcd-b367-a42b6950145a-config podName:79ec9746-96c0-4fcd-b367-a42b6950145a nodeName:}" failed. No retries permitted until 2026-03-16 00:10:33.884184658 +0000 UTC m=+226.980484682 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/79ec9746-96c0-4fcd-b367-a42b6950145a-config") pod "service-ca-operator-777779d784-vvdz2" (UID: "79ec9746-96c0-4fcd-b367-a42b6950145a") : failed to sync configmap cache: timed out waiting for the condition Mar 16 00:10:33 crc kubenswrapper[4816]: I0316 00:10:33.389428 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3eb76779-61d8-4977-8839-083fcf6cd69b-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-8tt9t\" (UID: \"3eb76779-61d8-4977-8839-083fcf6cd69b\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8tt9t" Mar 16 00:10:33 crc kubenswrapper[4816]: I0316 00:10:33.397257 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 16 00:10:33 crc kubenswrapper[4816]: I0316 00:10:33.417915 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 16 00:10:33 crc kubenswrapper[4816]: I0316 00:10:33.438898 4816 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 16 00:10:33 crc kubenswrapper[4816]: I0316 00:10:33.467126 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 16 00:10:33 crc kubenswrapper[4816]: I0316 00:10:33.477544 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 16 00:10:33 crc kubenswrapper[4816]: I0316 00:10:33.497997 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 16 00:10:33 crc kubenswrapper[4816]: I0316 00:10:33.517624 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 16 00:10:33 crc kubenswrapper[4816]: I0316 00:10:33.536753 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 16 00:10:33 crc kubenswrapper[4816]: I0316 00:10:33.556768 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 16 00:10:33 crc kubenswrapper[4816]: I0316 00:10:33.578899 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 16 00:10:33 crc kubenswrapper[4816]: I0316 00:10:33.597506 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 16 00:10:33 crc kubenswrapper[4816]: I0316 00:10:33.617472 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 16 00:10:33 crc kubenswrapper[4816]: I0316 00:10:33.637204 4816 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 16 00:10:33 crc kubenswrapper[4816]: I0316 00:10:33.658067 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 16 00:10:33 crc kubenswrapper[4816]: I0316 00:10:33.677620 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 16 00:10:33 crc kubenswrapper[4816]: I0316 00:10:33.696872 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 16 00:10:33 crc kubenswrapper[4816]: I0316 00:10:33.730823 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxgff\" (UniqueName: \"kubernetes.io/projected/c71f28a1-a68d-41c2-a9e6-4984e2e22c74-kube-api-access-vxgff\") pod \"etcd-operator-b45778765-2l7nk\" (UID: \"c71f28a1-a68d-41c2-a9e6-4984e2e22c74\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2l7nk" Mar 16 00:10:33 crc kubenswrapper[4816]: I0316 00:10:33.737668 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 16 00:10:33 crc kubenswrapper[4816]: I0316 00:10:33.757259 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 16 00:10:33 crc kubenswrapper[4816]: I0316 00:10:33.777292 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 16 00:10:33 crc kubenswrapper[4816]: I0316 00:10:33.798692 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 16 00:10:33 crc kubenswrapper[4816]: I0316 00:10:33.819349 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 16 00:10:33 crc kubenswrapper[4816]: I0316 00:10:33.836964 4816 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 16 00:10:33 crc kubenswrapper[4816]: I0316 00:10:33.857770 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 16 00:10:33 crc kubenswrapper[4816]: I0316 00:10:33.893078 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9r9tr\" (UniqueName: \"kubernetes.io/projected/1f26ea52-1f97-4d4a-98bd-897c5b3b88c5-kube-api-access-9r9tr\") pod \"apiserver-76f77b778f-pdm8d\" (UID: \"1f26ea52-1f97-4d4a-98bd-897c5b3b88c5\") " pod="openshift-apiserver/apiserver-76f77b778f-pdm8d" Mar 16 00:10:33 crc kubenswrapper[4816]: I0316 00:10:33.898284 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-2l7nk" Mar 16 00:10:33 crc kubenswrapper[4816]: I0316 00:10:33.904903 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/7442ef1b-27ea-4166-8457-5332c4c8f363-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-mplx7\" (UID: \"7442ef1b-27ea-4166-8457-5332c4c8f363\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-mplx7" Mar 16 00:10:33 crc kubenswrapper[4816]: I0316 00:10:33.905014 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79ec9746-96c0-4fcd-b367-a42b6950145a-config\") pod \"service-ca-operator-777779d784-vvdz2\" (UID: \"79ec9746-96c0-4fcd-b367-a42b6950145a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vvdz2" Mar 16 00:10:33 crc kubenswrapper[4816]: I0316 00:10:33.905071 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/db720c64-b1fa-48c9-a4b7-fc42f8ca47fd-signing-cabundle\") pod 
\"service-ca-9c57cc56f-6nkm6\" (UID: \"db720c64-b1fa-48c9-a4b7-fc42f8ca47fd\") " pod="openshift-service-ca/service-ca-9c57cc56f-6nkm6" Mar 16 00:10:33 crc kubenswrapper[4816]: I0316 00:10:33.905140 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/069c0b04-3302-488c-84fc-eeccac5fae9b-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-fk9l7\" (UID: \"069c0b04-3302-488c-84fc-eeccac5fae9b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fk9l7" Mar 16 00:10:33 crc kubenswrapper[4816]: I0316 00:10:33.905162 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/79ec9746-96c0-4fcd-b367-a42b6950145a-serving-cert\") pod \"service-ca-operator-777779d784-vvdz2\" (UID: \"79ec9746-96c0-4fcd-b367-a42b6950145a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vvdz2" Mar 16 00:10:33 crc kubenswrapper[4816]: I0316 00:10:33.905223 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/069c0b04-3302-488c-84fc-eeccac5fae9b-config\") pod \"kube-controller-manager-operator-78b949d7b-fk9l7\" (UID: \"069c0b04-3302-488c-84fc-eeccac5fae9b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fk9l7" Mar 16 00:10:33 crc kubenswrapper[4816]: I0316 00:10:33.905311 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/db720c64-b1fa-48c9-a4b7-fc42f8ca47fd-signing-key\") pod \"service-ca-9c57cc56f-6nkm6\" (UID: \"db720c64-b1fa-48c9-a4b7-fc42f8ca47fd\") " pod="openshift-service-ca/service-ca-9c57cc56f-6nkm6" Mar 16 00:10:33 crc kubenswrapper[4816]: I0316 00:10:33.905417 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/02854230-6165-4f22-8780-d8591b991132-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-8226q\" (UID: \"02854230-6165-4f22-8780-d8591b991132\") " pod="openshift-marketplace/marketplace-operator-79b997595-8226q" Mar 16 00:10:33 crc kubenswrapper[4816]: I0316 00:10:33.905452 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/02854230-6165-4f22-8780-d8591b991132-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-8226q\" (UID: \"02854230-6165-4f22-8780-d8591b991132\") " pod="openshift-marketplace/marketplace-operator-79b997595-8226q" Mar 16 00:10:33 crc kubenswrapper[4816]: I0316 00:10:33.906107 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79ec9746-96c0-4fcd-b367-a42b6950145a-config\") pod \"service-ca-operator-777779d784-vvdz2\" (UID: \"79ec9746-96c0-4fcd-b367-a42b6950145a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vvdz2" Mar 16 00:10:33 crc kubenswrapper[4816]: I0316 00:10:33.906236 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/069c0b04-3302-488c-84fc-eeccac5fae9b-config\") pod \"kube-controller-manager-operator-78b949d7b-fk9l7\" (UID: \"069c0b04-3302-488c-84fc-eeccac5fae9b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fk9l7" Mar 16 00:10:33 crc kubenswrapper[4816]: I0316 00:10:33.906790 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/02854230-6165-4f22-8780-d8591b991132-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-8226q\" (UID: \"02854230-6165-4f22-8780-d8591b991132\") " pod="openshift-marketplace/marketplace-operator-79b997595-8226q" Mar 
16 00:10:33 crc kubenswrapper[4816]: I0316 00:10:33.906867 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/db720c64-b1fa-48c9-a4b7-fc42f8ca47fd-signing-cabundle\") pod \"service-ca-9c57cc56f-6nkm6\" (UID: \"db720c64-b1fa-48c9-a4b7-fc42f8ca47fd\") " pod="openshift-service-ca/service-ca-9c57cc56f-6nkm6" Mar 16 00:10:33 crc kubenswrapper[4816]: I0316 00:10:33.908286 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/79ec9746-96c0-4fcd-b367-a42b6950145a-serving-cert\") pod \"service-ca-operator-777779d784-vvdz2\" (UID: \"79ec9746-96c0-4fcd-b367-a42b6950145a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vvdz2" Mar 16 00:10:33 crc kubenswrapper[4816]: I0316 00:10:33.908371 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/069c0b04-3302-488c-84fc-eeccac5fae9b-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-fk9l7\" (UID: \"069c0b04-3302-488c-84fc-eeccac5fae9b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fk9l7" Mar 16 00:10:33 crc kubenswrapper[4816]: I0316 00:10:33.908766 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/db720c64-b1fa-48c9-a4b7-fc42f8ca47fd-signing-key\") pod \"service-ca-9c57cc56f-6nkm6\" (UID: \"db720c64-b1fa-48c9-a4b7-fc42f8ca47fd\") " pod="openshift-service-ca/service-ca-9c57cc56f-6nkm6" Mar 16 00:10:33 crc kubenswrapper[4816]: I0316 00:10:33.908905 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/7442ef1b-27ea-4166-8457-5332c4c8f363-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-mplx7\" (UID: \"7442ef1b-27ea-4166-8457-5332c4c8f363\") " 
pod="openshift-multus/multus-admission-controller-857f4d67dd-mplx7" Mar 16 00:10:33 crc kubenswrapper[4816]: I0316 00:10:33.909242 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/02854230-6165-4f22-8780-d8591b991132-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-8226q\" (UID: \"02854230-6165-4f22-8780-d8591b991132\") " pod="openshift-marketplace/marketplace-operator-79b997595-8226q" Mar 16 00:10:33 crc kubenswrapper[4816]: I0316 00:10:33.916613 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 16 00:10:33 crc kubenswrapper[4816]: I0316 00:10:33.936946 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 16 00:10:33 crc kubenswrapper[4816]: I0316 00:10:33.948647 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-pdm8d" Mar 16 00:10:33 crc kubenswrapper[4816]: I0316 00:10:33.957009 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 16 00:10:33 crc kubenswrapper[4816]: I0316 00:10:33.977564 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.012484 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2zhnp\" (UniqueName: \"kubernetes.io/projected/dced2102-9fd0-4300-9e0a-35d915f1caad-kube-api-access-2zhnp\") pod \"apiserver-7bbb656c7d-zrq8d\" (UID: \"dced2102-9fd0-4300-9e0a-35d915f1caad\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zrq8d" Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.038506 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bkbl4\" 
(UniqueName: \"kubernetes.io/projected/c1674a73-a65c-4a8d-9dc5-af576a7af7d4-kube-api-access-bkbl4\") pod \"machine-approver-56656f9798-n7fkv\" (UID: \"c1674a73-a65c-4a8d-9dc5-af576a7af7d4\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-n7fkv" Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.053913 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zrq8d" Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.056674 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tcsz7\" (UniqueName: \"kubernetes.io/projected/5e41d768-3ed4-4760-a0d5-4308d7b13379-kube-api-access-tcsz7\") pod \"cluster-image-registry-operator-dc59b4c8b-s8mgz\" (UID: \"5e41d768-3ed4-4760-a0d5-4308d7b13379\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-s8mgz" Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.081101 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xdmv5\" (UniqueName: \"kubernetes.io/projected/171b00f7-f7cf-41b3-bffd-11ceeb9f2182-kube-api-access-xdmv5\") pod \"dns-operator-744455d44c-fnmb9\" (UID: \"171b00f7-f7cf-41b3-bffd-11ceeb9f2182\") " pod="openshift-dns-operator/dns-operator-744455d44c-fnmb9" Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.090942 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-2l7nk"] Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.095330 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9vhmx\" (UniqueName: \"kubernetes.io/projected/b98dcc1e-4c4b-47eb-9ddf-59a138f94247-kube-api-access-9vhmx\") pod \"cluster-samples-operator-665b6dd947-v8zx4\" (UID: \"b98dcc1e-4c4b-47eb-9ddf-59a138f94247\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-v8zx4" Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 
00:10:34.110875 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jc6t\" (UniqueName: \"kubernetes.io/projected/1306b657-0022-435d-bb72-793f1c1a106b-kube-api-access-2jc6t\") pod \"machine-api-operator-5694c8668f-9xv4p\" (UID: \"1306b657-0022-435d-bb72-793f1c1a106b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-9xv4p" Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.112099 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-pdm8d"] Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.115752 4816 request.go:700] Waited for 1.84377791s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-console/serviceaccounts/console/token Mar 16 00:10:34 crc kubenswrapper[4816]: W0316 00:10:34.120468 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1f26ea52_1f97_4d4a_98bd_897c5b3b88c5.slice/crio-227d11413d29bca1a6b2536f1ef23d7f212990e7b63c387b2647895501abe9e2 WatchSource:0}: Error finding container 227d11413d29bca1a6b2536f1ef23d7f212990e7b63c387b2647895501abe9e2: Status 404 returned error can't find the container with id 227d11413d29bca1a6b2536f1ef23d7f212990e7b63c387b2647895501abe9e2 Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.134226 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xnq8l\" (UniqueName: \"kubernetes.io/projected/32cac2d0-56f0-4ba8-86dc-9b57c0fcc11c-kube-api-access-xnq8l\") pod \"console-f9d7485db-nnqsw\" (UID: \"32cac2d0-56f0-4ba8-86dc-9b57c0fcc11c\") " pod="openshift-console/console-f9d7485db-nnqsw" Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.151545 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/5e41d768-3ed4-4760-a0d5-4308d7b13379-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-s8mgz\" (UID: \"5e41d768-3ed4-4760-a0d5-4308d7b13379\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-s8mgz" Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.171316 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-9xv4p" Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.171915 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fvtdx\" (UniqueName: \"kubernetes.io/projected/1d5466ab-a589-4f7e-ae89-2f494b10f6b1-kube-api-access-fvtdx\") pod \"route-controller-manager-6576b87f9c-d9j8j\" (UID: \"1d5466ab-a589-4f7e-ae89-2f494b10f6b1\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-d9j8j" Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.191564 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fht74\" (UniqueName: \"kubernetes.io/projected/005900fa-b395-4c1c-8e62-8e975bd0393c-kube-api-access-fht74\") pod \"openshift-apiserver-operator-796bbdcf4f-rqxss\" (UID: \"005900fa-b395-4c1c-8e62-8e975bd0393c\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rqxss" Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.212178 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pzj4l\" (UniqueName: \"kubernetes.io/projected/044562bd-df74-47fa-bc8d-1c652233e9c5-kube-api-access-pzj4l\") pod \"authentication-operator-69f744f599-6r96v\" (UID: \"044562bd-df74-47fa-bc8d-1c652233e9c5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-6r96v" Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.216808 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 16 
00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.236662 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-v8zx4" Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.236867 4816 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.257562 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.258595 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-zrq8d"] Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.276391 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.297027 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.317079 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.326826 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-n7fkv" Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.336910 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.345205 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-fnmb9" Mar 16 00:10:34 crc kubenswrapper[4816]: W0316 00:10:34.345757 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc1674a73_a65c_4a8d_9dc5_af576a7af7d4.slice/crio-156c21f74efb92d4cd61293079b23b80bcd12503e81ebb0027270843d47d9d60 WatchSource:0}: Error finding container 156c21f74efb92d4cd61293079b23b80bcd12503e81ebb0027270843d47d9d60: Status 404 returned error can't find the container with id 156c21f74efb92d4cd61293079b23b80bcd12503e81ebb0027270843d47d9d60 Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.358001 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.359290 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-d9j8j" Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.378062 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.380748 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-9xv4p"] Mar 16 00:10:34 crc kubenswrapper[4816]: W0316 00:10:34.393427 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1306b657_0022_435d_bb72_793f1c1a106b.slice/crio-64b816ad085833cd80a47777fbb505eb9e3f5c4e4e78be37220a15c940992742 WatchSource:0}: Error finding container 64b816ad085833cd80a47777fbb505eb9e3f5c4e4e78be37220a15c940992742: Status 404 returned error can't find the container with id 64b816ad085833cd80a47777fbb505eb9e3f5c4e4e78be37220a15c940992742 Mar 16 00:10:34 crc 
kubenswrapper[4816]: I0316 00:10:34.402630 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-s8mgz" Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.412430 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-v8zx4"] Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.416817 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dq52v\" (UniqueName: \"kubernetes.io/projected/55e76e8f-7d69-4f55-81f8-45c9c612876b-kube-api-access-dq52v\") pod \"auto-csr-approver-29560330-44pts\" (UID: \"55e76e8f-7d69-4f55-81f8-45c9c612876b\") " pod="openshift-infra/auto-csr-approver-29560330-44pts" Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.418402 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-nnqsw" Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.452493 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/93dadffe-0353-4301-bb97-31b034d3dc64-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-t8c6x\" (UID: \"93dadffe-0353-4301-bb97-31b034d3dc64\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-t8c6x" Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.457016 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ct69q\" (UniqueName: \"kubernetes.io/projected/7442ef1b-27ea-4166-8457-5332c4c8f363-kube-api-access-ct69q\") pod \"multus-admission-controller-857f4d67dd-mplx7\" (UID: \"7442ef1b-27ea-4166-8457-5332c4c8f363\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-mplx7" Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.460938 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-6r96v" Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.479070 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5mpc\" (UniqueName: \"kubernetes.io/projected/681ca8e4-f909-4e8b-9f35-5ab8ca382e44-kube-api-access-n5mpc\") pod \"router-default-5444994796-gvk75\" (UID: \"681ca8e4-f909-4e8b-9f35-5ab8ca382e44\") " pod="openshift-ingress/router-default-5444994796-gvk75" Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.483039 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rqxss" Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.498153 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wvzp\" (UniqueName: \"kubernetes.io/projected/34f93b2b-cc36-4965-992c-825bf2595e1e-kube-api-access-4wvzp\") pod \"machine-config-controller-84d6567774-tgbjg\" (UID: \"34f93b2b-cc36-4965-992c-825bf2595e1e\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tgbjg" Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.513856 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tgbjg" Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.521289 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-t8c6x" Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.522087 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9vkgk\" (UniqueName: \"kubernetes.io/projected/0ec3cdc0-f024-43cf-b520-7d2437e0f8df-kube-api-access-9vkgk\") pod \"downloads-7954f5f757-5rr7c\" (UID: \"0ec3cdc0-f024-43cf-b520-7d2437e0f8df\") " pod="openshift-console/downloads-7954f5f757-5rr7c" Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.550169 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-gvk75" Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.556310 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxxlh\" (UniqueName: \"kubernetes.io/projected/f4bdfe91-48ef-4a44-99ba-d7ab90df9ec0-kube-api-access-hxxlh\") pod \"machine-config-operator-74547568cd-4t6gm\" (UID: \"f4bdfe91-48ef-4a44-99ba-d7ab90df9ec0\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4t6gm" Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.561036 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/069c0b04-3302-488c-84fc-eeccac5fae9b-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-fk9l7\" (UID: \"069c0b04-3302-488c-84fc-eeccac5fae9b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fk9l7" Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.579020 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-86fb4\" (UniqueName: \"kubernetes.io/projected/4cc341f9-55c7-4bce-a0e3-24df68ca7f0e-kube-api-access-86fb4\") pod \"console-operator-58897d9998-q9xc9\" (UID: \"4cc341f9-55c7-4bce-a0e3-24df68ca7f0e\") " 
pod="openshift-console-operator/console-operator-58897d9998-q9xc9" Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.592877 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-fnmb9"] Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.604655 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fk9l7" Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.607910 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zsbsw\" (UniqueName: \"kubernetes.io/projected/59c840a8-f288-44ed-83d3-34d47041c6c6-kube-api-access-zsbsw\") pod \"controller-manager-879f6c89f-tv2n7\" (UID: \"59c840a8-f288-44ed-83d3-34d47041c6c6\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tv2n7" Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.620022 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-d9j8j"] Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.626435 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxqlb\" (UniqueName: \"kubernetes.io/projected/29bcfe72-6ef1-4087-9feb-787fdba3d2d7-kube-api-access-xxqlb\") pod \"catalog-operator-68c6474976-gtwmf\" (UID: \"29bcfe72-6ef1-4087-9feb-787fdba3d2d7\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-gtwmf" Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.630538 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29560330-44pts" Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.632942 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ef3a7303-57a8-461f-86c1-fd3f7882e93b-bound-sa-token\") pod \"ingress-operator-5b745b69d9-l648b\" (UID: \"ef3a7303-57a8-461f-86c1-fd3f7882e93b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-l648b" Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.646164 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-mplx7" Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.659690 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fskkz\" (UniqueName: \"kubernetes.io/projected/ef3a7303-57a8-461f-86c1-fd3f7882e93b-kube-api-access-fskkz\") pod \"ingress-operator-5b745b69d9-l648b\" (UID: \"ef3a7303-57a8-461f-86c1-fd3f7882e93b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-l648b" Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.681256 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nfjfn\" (UniqueName: \"kubernetes.io/projected/4f90d894-17c6-4800-a438-737fe8619e01-kube-api-access-nfjfn\") pod \"openshift-config-operator-7777fb866f-d4vrm\" (UID: \"4f90d894-17c6-4800-a438-737fe8619e01\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-d4vrm" Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.694800 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwr4x\" (UniqueName: \"kubernetes.io/projected/064b42ee-720b-456c-8ffe-a247f827befc-kube-api-access-dwr4x\") pod \"packageserver-d55dfcdfc-zn6w7\" (UID: \"064b42ee-720b-456c-8ffe-a247f827befc\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zn6w7" Mar 16 00:10:34 crc 
kubenswrapper[4816]: I0316 00:10:34.709932 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-tv2n7" Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.721137 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2tj47\" (UniqueName: \"kubernetes.io/projected/9fc59286-0388-4519-afc7-f2c8cf80ab40-kube-api-access-2tj47\") pod \"image-pruner-29560320-s9q72\" (UID: \"9fc59286-0388-4519-afc7-f2c8cf80ab40\") " pod="openshift-image-registry/image-pruner-29560320-s9q72" Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.722929 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-q9xc9" Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.732631 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-pruner-29560320-s9q72" Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.733355 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3eb76779-61d8-4977-8839-083fcf6cd69b-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-8tt9t\" (UID: \"3eb76779-61d8-4977-8839-083fcf6cd69b\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8tt9t" Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.746852 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-5rr7c" Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.749522 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zrq8d" event={"ID":"dced2102-9fd0-4300-9e0a-35d915f1caad","Type":"ContainerStarted","Data":"0f46d32a82d22035ebca6b72f5d02298b879e23dc630a2d6147dc46c9ae24083"} Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.750724 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-n7fkv" event={"ID":"c1674a73-a65c-4a8d-9dc5-af576a7af7d4","Type":"ContainerStarted","Data":"156c21f74efb92d4cd61293079b23b80bcd12503e81ebb0027270843d47d9d60"} Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.751873 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-9xv4p" event={"ID":"1306b657-0022-435d-bb72-793f1c1a106b","Type":"ContainerStarted","Data":"64b816ad085833cd80a47777fbb505eb9e3f5c4e4e78be37220a15c940992742"} Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.752100 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-d4vrm" Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.753581 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-2l7nk" event={"ID":"c71f28a1-a68d-41c2-a9e6-4984e2e22c74","Type":"ContainerStarted","Data":"013ba8198b1ad95322c8c4ed9ed54ee95419f7fa9530c9926a8a7542e0bd3fdb"} Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.753608 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-2l7nk" event={"ID":"c71f28a1-a68d-41c2-a9e6-4984e2e22c74","Type":"ContainerStarted","Data":"ecadbb640205a413a401e341202c47a74197065f9e52bbe132223e6f5560a08b"} Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.754992 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bs9wq\" (UniqueName: \"kubernetes.io/projected/aa9721ae-1aaa-4d49-9f05-5aabb9ab31d9-kube-api-access-bs9wq\") pod \"kube-storage-version-migrator-operator-b67b599dd-ll5r8\" (UID: \"aa9721ae-1aaa-4d49-9f05-5aabb9ab31d9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ll5r8" Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.755259 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-fnmb9" event={"ID":"171b00f7-f7cf-41b3-bffd-11ceeb9f2182","Type":"ContainerStarted","Data":"b46ca38a8fb490811ed5bb54ba044f6698f8a29b3e0912b3a995b3f104cd2fa5"} Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.756211 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-gvk75" event={"ID":"681ca8e4-f909-4e8b-9f35-5ab8ca382e44","Type":"ContainerStarted","Data":"0f3dfd3525c221051b9ec7cabcb25168989137878f8bcc52c0c3c3226135fea5"} Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.757398 4816 generic.go:334] "Generic 
(PLEG): container finished" podID="1f26ea52-1f97-4d4a-98bd-897c5b3b88c5" containerID="973c9812ff199e7d01a75bd951b16fc7b17c1f59dae9b6ee85591b33b9699bf5" exitCode=0 Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.757435 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-pdm8d" event={"ID":"1f26ea52-1f97-4d4a-98bd-897c5b3b88c5","Type":"ContainerDied","Data":"973c9812ff199e7d01a75bd951b16fc7b17c1f59dae9b6ee85591b33b9699bf5"} Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.757461 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-pdm8d" event={"ID":"1f26ea52-1f97-4d4a-98bd-897c5b3b88c5","Type":"ContainerStarted","Data":"227d11413d29bca1a6b2536f1ef23d7f212990e7b63c387b2647895501abe9e2"} Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.767038 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-l648b" Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.772404 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pg5rd\" (UniqueName: \"kubernetes.io/projected/0a98bca9-38d3-4382-a6d6-8410170f7d81-kube-api-access-pg5rd\") pod \"openshift-controller-manager-operator-756b6f6bc6-sh4ps\" (UID: \"0a98bca9-38d3-4382-a6d6-8410170f7d81\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-sh4ps" Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.795368 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wwn7z\" (UniqueName: \"kubernetes.io/projected/79ec9746-96c0-4fcd-b367-a42b6950145a-kube-api-access-wwn7z\") pod \"service-ca-operator-777779d784-vvdz2\" (UID: \"79ec9746-96c0-4fcd-b367-a42b6950145a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vvdz2" Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.804328 4816 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-gtwmf" Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.812896 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-spn2v\" (UniqueName: \"kubernetes.io/projected/dc6dfded-ec9e-4a6f-a97b-3bb4cf8a149a-kube-api-access-spn2v\") pod \"control-plane-machine-set-operator-78cbb6b69f-mwhpz\" (UID: \"dc6dfded-ec9e-4a6f-a97b-3bb4cf8a149a\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-mwhpz" Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.830527 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4t6gm" Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.841577 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdffs\" (UniqueName: \"kubernetes.io/projected/9397185e-a9e3-4ef4-b0be-d9dc9208adff-kube-api-access-pdffs\") pod \"collect-profiles-29560320-4hk5d\" (UID: \"9397185e-a9e3-4ef4-b0be-d9dc9208adff\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29560320-4hk5d" Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.862950 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8tt9t" Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.865503 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ll5r8" Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.868277 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pkph5\" (UniqueName: \"kubernetes.io/projected/db720c64-b1fa-48c9-a4b7-fc42f8ca47fd-kube-api-access-pkph5\") pod \"service-ca-9c57cc56f-6nkm6\" (UID: \"db720c64-b1fa-48c9-a4b7-fc42f8ca47fd\") " pod="openshift-service-ca/service-ca-9c57cc56f-6nkm6" Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.872647 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29560320-4hk5d" Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.878353 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j62nk\" (UniqueName: \"kubernetes.io/projected/0386f821-c5fb-4dfd-acaf-706e214a57c0-kube-api-access-j62nk\") pod \"migrator-59844c95c7-fwkzt\" (UID: \"0386f821-c5fb-4dfd-acaf-706e214a57c0\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-fwkzt" Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.886084 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zn6w7" Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.910354 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-vvdz2" Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.913969 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rclzn\" (UniqueName: \"kubernetes.io/projected/a1bc4b9a-e741-4f63-81e8-fdce3da0a5ea-kube-api-access-rclzn\") pod \"olm-operator-6b444d44fb-nsxl4\" (UID: \"a1bc4b9a-e741-4f63-81e8-fdce3da0a5ea\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-nsxl4" Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.915003 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-s8mgz"] Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.932740 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wgzct\" (UniqueName: \"kubernetes.io/projected/9e737c04-a2db-452e-adc7-fa383e158b53-kube-api-access-wgzct\") pod \"package-server-manager-789f6589d5-jm8db\" (UID: \"9e737c04-a2db-452e-adc7-fa383e158b53\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jm8db" Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.940040 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-6nkm6" Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.946781 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zgh99\" (UniqueName: \"kubernetes.io/projected/02854230-6165-4f22-8780-d8591b991132-kube-api-access-zgh99\") pod \"marketplace-operator-79b997595-8226q\" (UID: \"02854230-6165-4f22-8780-d8591b991132\") " pod="openshift-marketplace/marketplace-operator-79b997595-8226q" Mar 16 00:10:34 crc kubenswrapper[4816]: I0316 00:10:34.986972 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-nnqsw"] Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.027909 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/7c3e347f-464a-43f1-bf29-689bf81a28e6-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-sshl5\" (UID: \"7c3e347f-464a-43f1-bf29-689bf81a28e6\") " pod="openshift-authentication/oauth-openshift-558db77b4-sshl5" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.027991 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/7c3e347f-464a-43f1-bf29-689bf81a28e6-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-sshl5\" (UID: \"7c3e347f-464a-43f1-bf29-689bf81a28e6\") " pod="openshift-authentication/oauth-openshift-558db77b4-sshl5" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.028022 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/7c3e347f-464a-43f1-bf29-689bf81a28e6-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-sshl5\" (UID: 
\"7c3e347f-464a-43f1-bf29-689bf81a28e6\") " pod="openshift-authentication/oauth-openshift-558db77b4-sshl5" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.028045 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/7c3e347f-464a-43f1-bf29-689bf81a28e6-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-sshl5\" (UID: \"7c3e347f-464a-43f1-bf29-689bf81a28e6\") " pod="openshift-authentication/oauth-openshift-558db77b4-sshl5" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.028071 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9r7jk\" (UniqueName: \"kubernetes.io/projected/b155133b-d494-44bc-aa5d-23efc7cbd7a6-kube-api-access-9r7jk\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.028096 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/7c3e347f-464a-43f1-bf29-689bf81a28e6-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-sshl5\" (UID: \"7c3e347f-464a-43f1-bf29-689bf81a28e6\") " pod="openshift-authentication/oauth-openshift-558db77b4-sshl5" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.028135 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/7c3e347f-464a-43f1-bf29-689bf81a28e6-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-sshl5\" (UID: \"7c3e347f-464a-43f1-bf29-689bf81a28e6\") " pod="openshift-authentication/oauth-openshift-558db77b4-sshl5" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 
00:10:35.028163 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b155133b-d494-44bc-aa5d-23efc7cbd7a6-installation-pull-secrets\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.028187 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b155133b-d494-44bc-aa5d-23efc7cbd7a6-ca-trust-extracted\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.028213 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b155133b-d494-44bc-aa5d-23efc7cbd7a6-registry-certificates\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.028238 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7c3e347f-464a-43f1-bf29-689bf81a28e6-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-sshl5\" (UID: \"7c3e347f-464a-43f1-bf29-689bf81a28e6\") " pod="openshift-authentication/oauth-openshift-558db77b4-sshl5" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.028262 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/b155133b-d494-44bc-aa5d-23efc7cbd7a6-registry-tls\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.028285 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/7c3e347f-464a-43f1-bf29-689bf81a28e6-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-sshl5\" (UID: \"7c3e347f-464a-43f1-bf29-689bf81a28e6\") " pod="openshift-authentication/oauth-openshift-558db77b4-sshl5" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.028321 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/7c3e347f-464a-43f1-bf29-689bf81a28e6-audit-policies\") pod \"oauth-openshift-558db77b4-sshl5\" (UID: \"7c3e347f-464a-43f1-bf29-689bf81a28e6\") " pod="openshift-authentication/oauth-openshift-558db77b4-sshl5" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.028346 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/7c3e347f-464a-43f1-bf29-689bf81a28e6-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-sshl5\" (UID: \"7c3e347f-464a-43f1-bf29-689bf81a28e6\") " pod="openshift-authentication/oauth-openshift-558db77b4-sshl5" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.028372 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxdqt\" (UniqueName: \"kubernetes.io/projected/7c3e347f-464a-43f1-bf29-689bf81a28e6-kube-api-access-kxdqt\") pod \"oauth-openshift-558db77b4-sshl5\" (UID: \"7c3e347f-464a-43f1-bf29-689bf81a28e6\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-sshl5" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.028396 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b155133b-d494-44bc-aa5d-23efc7cbd7a6-trusted-ca\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.028419 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/7c3e347f-464a-43f1-bf29-689bf81a28e6-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-sshl5\" (UID: \"7c3e347f-464a-43f1-bf29-689bf81a28e6\") " pod="openshift-authentication/oauth-openshift-558db77b4-sshl5" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.028446 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7c3e347f-464a-43f1-bf29-689bf81a28e6-audit-dir\") pod \"oauth-openshift-558db77b4-sshl5\" (UID: \"7c3e347f-464a-43f1-bf29-689bf81a28e6\") " pod="openshift-authentication/oauth-openshift-558db77b4-sshl5" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.028469 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/7c3e347f-464a-43f1-bf29-689bf81a28e6-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-sshl5\" (UID: \"7c3e347f-464a-43f1-bf29-689bf81a28e6\") " pod="openshift-authentication/oauth-openshift-558db77b4-sshl5" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.028496 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" 
(UniqueName: \"kubernetes.io/projected/b155133b-d494-44bc-aa5d-23efc7cbd7a6-bound-sa-token\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.028531 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:10:35 crc kubenswrapper[4816]: E0316 00:10:35.028898 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:35.528882574 +0000 UTC m=+228.625182527 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ckvwn" (UID: "b155133b-d494-44bc-aa5d-23efc7cbd7a6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.061805 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-sh4ps" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.083954 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-mwhpz" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.109370 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-t8c6x"] Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.111926 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-tgbjg"] Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.120607 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-6r96v"] Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.121801 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rqxss"] Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.129303 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:35 crc kubenswrapper[4816]: E0316 00:10:35.129533 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:35.629444919 +0000 UTC m=+228.725744872 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.129755 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/7c3e347f-464a-43f1-bf29-689bf81a28e6-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-sshl5\" (UID: \"7c3e347f-464a-43f1-bf29-689bf81a28e6\") " pod="openshift-authentication/oauth-openshift-558db77b4-sshl5" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.129849 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/a1a13dfd-7f8c-4a52-9ace-ffb14d3d91f2-socket-dir\") pod \"csi-hostpathplugin-trp9l\" (UID: \"a1a13dfd-7f8c-4a52-9ace-ffb14d3d91f2\") " pod="hostpath-provisioner/csi-hostpathplugin-trp9l" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.131901 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/7c3e347f-464a-43f1-bf29-689bf81a28e6-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-sshl5\" (UID: \"7c3e347f-464a-43f1-bf29-689bf81a28e6\") " pod="openshift-authentication/oauth-openshift-558db77b4-sshl5" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.132037 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: 
\"kubernetes.io/secret/7c3e347f-464a-43f1-bf29-689bf81a28e6-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-sshl5\" (UID: \"7c3e347f-464a-43f1-bf29-689bf81a28e6\") " pod="openshift-authentication/oauth-openshift-558db77b4-sshl5" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.132103 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/7c3e347f-464a-43f1-bf29-689bf81a28e6-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-sshl5\" (UID: \"7c3e347f-464a-43f1-bf29-689bf81a28e6\") " pod="openshift-authentication/oauth-openshift-558db77b4-sshl5" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.132237 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9r7jk\" (UniqueName: \"kubernetes.io/projected/b155133b-d494-44bc-aa5d-23efc7cbd7a6-kube-api-access-9r7jk\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.132280 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/7c3e347f-464a-43f1-bf29-689bf81a28e6-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-sshl5\" (UID: \"7c3e347f-464a-43f1-bf29-689bf81a28e6\") " pod="openshift-authentication/oauth-openshift-558db77b4-sshl5" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.132303 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fb4df3ac-df8b-4c9c-b7aa-dd9cd455a847-config-volume\") pod \"dns-default-npvts\" (UID: \"fb4df3ac-df8b-4c9c-b7aa-dd9cd455a847\") " pod="openshift-dns/dns-default-npvts" Mar 16 00:10:35 crc 
kubenswrapper[4816]: I0316 00:10:35.132319 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fb4df3ac-df8b-4c9c-b7aa-dd9cd455a847-metrics-tls\") pod \"dns-default-npvts\" (UID: \"fb4df3ac-df8b-4c9c-b7aa-dd9cd455a847\") " pod="openshift-dns/dns-default-npvts" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.132386 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kgzpq\" (UniqueName: \"kubernetes.io/projected/fb4df3ac-df8b-4c9c-b7aa-dd9cd455a847-kube-api-access-kgzpq\") pod \"dns-default-npvts\" (UID: \"fb4df3ac-df8b-4c9c-b7aa-dd9cd455a847\") " pod="openshift-dns/dns-default-npvts" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.133540 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/7c3e347f-464a-43f1-bf29-689bf81a28e6-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-sshl5\" (UID: \"7c3e347f-464a-43f1-bf29-689bf81a28e6\") " pod="openshift-authentication/oauth-openshift-558db77b4-sshl5" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.133665 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/7c3e347f-464a-43f1-bf29-689bf81a28e6-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-sshl5\" (UID: \"7c3e347f-464a-43f1-bf29-689bf81a28e6\") " pod="openshift-authentication/oauth-openshift-558db77b4-sshl5" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.134625 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/7c3e347f-464a-43f1-bf29-689bf81a28e6-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-sshl5\" (UID: \"7c3e347f-464a-43f1-bf29-689bf81a28e6\") 
" pod="openshift-authentication/oauth-openshift-558db77b4-sshl5" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.135540 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b155133b-d494-44bc-aa5d-23efc7cbd7a6-installation-pull-secrets\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.135706 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b155133b-d494-44bc-aa5d-23efc7cbd7a6-ca-trust-extracted\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.135829 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b155133b-d494-44bc-aa5d-23efc7cbd7a6-registry-certificates\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.135987 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7c3e347f-464a-43f1-bf29-689bf81a28e6-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-sshl5\" (UID: \"7c3e347f-464a-43f1-bf29-689bf81a28e6\") " pod="openshift-authentication/oauth-openshift-558db77b4-sshl5" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.136026 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjls2\" (UniqueName: 
\"kubernetes.io/projected/63006c82-767f-4514-9d7e-5afd9bfe6e96-kube-api-access-zjls2\") pod \"machine-config-server-vkr88\" (UID: \"63006c82-767f-4514-9d7e-5afd9bfe6e96\") " pod="openshift-machine-config-operator/machine-config-server-vkr88" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.136199 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/7c3e347f-464a-43f1-bf29-689bf81a28e6-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-sshl5\" (UID: \"7c3e347f-464a-43f1-bf29-689bf81a28e6\") " pod="openshift-authentication/oauth-openshift-558db77b4-sshl5" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.136311 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b155133b-d494-44bc-aa5d-23efc7cbd7a6-registry-tls\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.136427 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4gx5\" (UniqueName: \"kubernetes.io/projected/7759829d-d50c-4dd7-8627-040ebf8f0e40-kube-api-access-x4gx5\") pod \"ingress-canary-2k6jt\" (UID: \"7759829d-d50c-4dd7-8627-040ebf8f0e40\") " pod="openshift-ingress-canary/ingress-canary-2k6jt" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.136787 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b155133b-d494-44bc-aa5d-23efc7cbd7a6-registry-certificates\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.137609 4816 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/a1a13dfd-7f8c-4a52-9ace-ffb14d3d91f2-registration-dir\") pod \"csi-hostpathplugin-trp9l\" (UID: \"a1a13dfd-7f8c-4a52-9ace-ffb14d3d91f2\") " pod="hostpath-provisioner/csi-hostpathplugin-trp9l" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.137712 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/7c3e347f-464a-43f1-bf29-689bf81a28e6-audit-policies\") pod \"oauth-openshift-558db77b4-sshl5\" (UID: \"7c3e347f-464a-43f1-bf29-689bf81a28e6\") " pod="openshift-authentication/oauth-openshift-558db77b4-sshl5" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.137746 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/7c3e347f-464a-43f1-bf29-689bf81a28e6-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-sshl5\" (UID: \"7c3e347f-464a-43f1-bf29-689bf81a28e6\") " pod="openshift-authentication/oauth-openshift-558db77b4-sshl5" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.137818 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kxdqt\" (UniqueName: \"kubernetes.io/projected/7c3e347f-464a-43f1-bf29-689bf81a28e6-kube-api-access-kxdqt\") pod \"oauth-openshift-558db77b4-sshl5\" (UID: \"7c3e347f-464a-43f1-bf29-689bf81a28e6\") " pod="openshift-authentication/oauth-openshift-558db77b4-sshl5" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.137848 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7759829d-d50c-4dd7-8627-040ebf8f0e40-cert\") pod \"ingress-canary-2k6jt\" (UID: \"7759829d-d50c-4dd7-8627-040ebf8f0e40\") " 
pod="openshift-ingress-canary/ingress-canary-2k6jt" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.137920 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/63006c82-767f-4514-9d7e-5afd9bfe6e96-certs\") pod \"machine-config-server-vkr88\" (UID: \"63006c82-767f-4514-9d7e-5afd9bfe6e96\") " pod="openshift-machine-config-operator/machine-config-server-vkr88" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.137972 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b155133b-d494-44bc-aa5d-23efc7cbd7a6-trusted-ca\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.138000 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/63006c82-767f-4514-9d7e-5afd9bfe6e96-node-bootstrap-token\") pod \"machine-config-server-vkr88\" (UID: \"63006c82-767f-4514-9d7e-5afd9bfe6e96\") " pod="openshift-machine-config-operator/machine-config-server-vkr88" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.138234 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-nsxl4" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.138620 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7c3e347f-464a-43f1-bf29-689bf81a28e6-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-sshl5\" (UID: \"7c3e347f-464a-43f1-bf29-689bf81a28e6\") " pod="openshift-authentication/oauth-openshift-558db77b4-sshl5" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.138631 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b155133b-d494-44bc-aa5d-23efc7cbd7a6-ca-trust-extracted\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.138765 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/7c3e347f-464a-43f1-bf29-689bf81a28e6-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-sshl5\" (UID: \"7c3e347f-464a-43f1-bf29-689bf81a28e6\") " pod="openshift-authentication/oauth-openshift-558db77b4-sshl5" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.138792 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/a1a13dfd-7f8c-4a52-9ace-ffb14d3d91f2-csi-data-dir\") pod \"csi-hostpathplugin-trp9l\" (UID: \"a1a13dfd-7f8c-4a52-9ace-ffb14d3d91f2\") " pod="hostpath-provisioner/csi-hostpathplugin-trp9l" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.139357 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/b155133b-d494-44bc-aa5d-23efc7cbd7a6-trusted-ca\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.139532 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7c3e347f-464a-43f1-bf29-689bf81a28e6-audit-dir\") pod \"oauth-openshift-558db77b4-sshl5\" (UID: \"7c3e347f-464a-43f1-bf29-689bf81a28e6\") " pod="openshift-authentication/oauth-openshift-558db77b4-sshl5" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.139855 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7c3e347f-464a-43f1-bf29-689bf81a28e6-audit-dir\") pod \"oauth-openshift-558db77b4-sshl5\" (UID: \"7c3e347f-464a-43f1-bf29-689bf81a28e6\") " pod="openshift-authentication/oauth-openshift-558db77b4-sshl5" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.139982 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/7c3e347f-464a-43f1-bf29-689bf81a28e6-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-sshl5\" (UID: \"7c3e347f-464a-43f1-bf29-689bf81a28e6\") " pod="openshift-authentication/oauth-openshift-558db77b4-sshl5" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.140127 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b155133b-d494-44bc-aa5d-23efc7cbd7a6-bound-sa-token\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.140325 4816 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.140416 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/a1a13dfd-7f8c-4a52-9ace-ffb14d3d91f2-mountpoint-dir\") pod \"csi-hostpathplugin-trp9l\" (UID: \"a1a13dfd-7f8c-4a52-9ace-ffb14d3d91f2\") " pod="hostpath-provisioner/csi-hostpathplugin-trp9l" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.140680 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z5cmq\" (UniqueName: \"kubernetes.io/projected/a1a13dfd-7f8c-4a52-9ace-ffb14d3d91f2-kube-api-access-z5cmq\") pod \"csi-hostpathplugin-trp9l\" (UID: \"a1a13dfd-7f8c-4a52-9ace-ffb14d3d91f2\") " pod="hostpath-provisioner/csi-hostpathplugin-trp9l" Mar 16 00:10:35 crc kubenswrapper[4816]: E0316 00:10:35.140744 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:35.640731388 +0000 UTC m=+228.737031341 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ckvwn" (UID: "b155133b-d494-44bc-aa5d-23efc7cbd7a6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.140778 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/a1a13dfd-7f8c-4a52-9ace-ffb14d3d91f2-plugins-dir\") pod \"csi-hostpathplugin-trp9l\" (UID: \"a1a13dfd-7f8c-4a52-9ace-ffb14d3d91f2\") " pod="hostpath-provisioner/csi-hostpathplugin-trp9l" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.142078 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/7c3e347f-464a-43f1-bf29-689bf81a28e6-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-sshl5\" (UID: \"7c3e347f-464a-43f1-bf29-689bf81a28e6\") " pod="openshift-authentication/oauth-openshift-558db77b4-sshl5" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.143682 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/7c3e347f-464a-43f1-bf29-689bf81a28e6-audit-policies\") pod \"oauth-openshift-558db77b4-sshl5\" (UID: \"7c3e347f-464a-43f1-bf29-689bf81a28e6\") " pod="openshift-authentication/oauth-openshift-558db77b4-sshl5" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.144775 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b155133b-d494-44bc-aa5d-23efc7cbd7a6-installation-pull-secrets\") pod 
\"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.144810 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/7c3e347f-464a-43f1-bf29-689bf81a28e6-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-sshl5\" (UID: \"7c3e347f-464a-43f1-bf29-689bf81a28e6\") " pod="openshift-authentication/oauth-openshift-558db77b4-sshl5" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.145096 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/7c3e347f-464a-43f1-bf29-689bf81a28e6-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-sshl5\" (UID: \"7c3e347f-464a-43f1-bf29-689bf81a28e6\") " pod="openshift-authentication/oauth-openshift-558db77b4-sshl5" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.145784 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/7c3e347f-464a-43f1-bf29-689bf81a28e6-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-sshl5\" (UID: \"7c3e347f-464a-43f1-bf29-689bf81a28e6\") " pod="openshift-authentication/oauth-openshift-558db77b4-sshl5" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.145901 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b155133b-d494-44bc-aa5d-23efc7cbd7a6-registry-tls\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.147048 4816 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/7c3e347f-464a-43f1-bf29-689bf81a28e6-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-sshl5\" (UID: \"7c3e347f-464a-43f1-bf29-689bf81a28e6\") " pod="openshift-authentication/oauth-openshift-558db77b4-sshl5" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.147286 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/7c3e347f-464a-43f1-bf29-689bf81a28e6-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-sshl5\" (UID: \"7c3e347f-464a-43f1-bf29-689bf81a28e6\") " pod="openshift-authentication/oauth-openshift-558db77b4-sshl5" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.151370 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/7c3e347f-464a-43f1-bf29-689bf81a28e6-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-sshl5\" (UID: \"7c3e347f-464a-43f1-bf29-689bf81a28e6\") " pod="openshift-authentication/oauth-openshift-558db77b4-sshl5" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.151836 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/7c3e347f-464a-43f1-bf29-689bf81a28e6-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-sshl5\" (UID: \"7c3e347f-464a-43f1-bf29-689bf81a28e6\") " pod="openshift-authentication/oauth-openshift-558db77b4-sshl5" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.157958 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jm8db" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.159105 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9r7jk\" (UniqueName: \"kubernetes.io/projected/b155133b-d494-44bc-aa5d-23efc7cbd7a6-kube-api-access-9r7jk\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.178032 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-fwkzt" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.205509 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kxdqt\" (UniqueName: \"kubernetes.io/projected/7c3e347f-464a-43f1-bf29-689bf81a28e6-kube-api-access-kxdqt\") pod \"oauth-openshift-558db77b4-sshl5\" (UID: \"7c3e347f-464a-43f1-bf29-689bf81a28e6\") " pod="openshift-authentication/oauth-openshift-558db77b4-sshl5" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.226290 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-8226q" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.227725 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b155133b-d494-44bc-aa5d-23efc7cbd7a6-bound-sa-token\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.242158 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.242327 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/63006c82-767f-4514-9d7e-5afd9bfe6e96-certs\") pod \"machine-config-server-vkr88\" (UID: \"63006c82-767f-4514-9d7e-5afd9bfe6e96\") " pod="openshift-machine-config-operator/machine-config-server-vkr88" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.242356 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/63006c82-767f-4514-9d7e-5afd9bfe6e96-node-bootstrap-token\") pod \"machine-config-server-vkr88\" (UID: \"63006c82-767f-4514-9d7e-5afd9bfe6e96\") " pod="openshift-machine-config-operator/machine-config-server-vkr88" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.242379 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/a1a13dfd-7f8c-4a52-9ace-ffb14d3d91f2-csi-data-dir\") pod \"csi-hostpathplugin-trp9l\" (UID: 
\"a1a13dfd-7f8c-4a52-9ace-ffb14d3d91f2\") " pod="hostpath-provisioner/csi-hostpathplugin-trp9l" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.242415 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/a1a13dfd-7f8c-4a52-9ace-ffb14d3d91f2-mountpoint-dir\") pod \"csi-hostpathplugin-trp9l\" (UID: \"a1a13dfd-7f8c-4a52-9ace-ffb14d3d91f2\") " pod="hostpath-provisioner/csi-hostpathplugin-trp9l" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.242448 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z5cmq\" (UniqueName: \"kubernetes.io/projected/a1a13dfd-7f8c-4a52-9ace-ffb14d3d91f2-kube-api-access-z5cmq\") pod \"csi-hostpathplugin-trp9l\" (UID: \"a1a13dfd-7f8c-4a52-9ace-ffb14d3d91f2\") " pod="hostpath-provisioner/csi-hostpathplugin-trp9l" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.242476 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/a1a13dfd-7f8c-4a52-9ace-ffb14d3d91f2-plugins-dir\") pod \"csi-hostpathplugin-trp9l\" (UID: \"a1a13dfd-7f8c-4a52-9ace-ffb14d3d91f2\") " pod="hostpath-provisioner/csi-hostpathplugin-trp9l" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.242510 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/a1a13dfd-7f8c-4a52-9ace-ffb14d3d91f2-socket-dir\") pod \"csi-hostpathplugin-trp9l\" (UID: \"a1a13dfd-7f8c-4a52-9ace-ffb14d3d91f2\") " pod="hostpath-provisioner/csi-hostpathplugin-trp9l" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.242581 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fb4df3ac-df8b-4c9c-b7aa-dd9cd455a847-config-volume\") pod \"dns-default-npvts\" (UID: \"fb4df3ac-df8b-4c9c-b7aa-dd9cd455a847\") " 
pod="openshift-dns/dns-default-npvts" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.242600 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fb4df3ac-df8b-4c9c-b7aa-dd9cd455a847-metrics-tls\") pod \"dns-default-npvts\" (UID: \"fb4df3ac-df8b-4c9c-b7aa-dd9cd455a847\") " pod="openshift-dns/dns-default-npvts" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.242641 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kgzpq\" (UniqueName: \"kubernetes.io/projected/fb4df3ac-df8b-4c9c-b7aa-dd9cd455a847-kube-api-access-kgzpq\") pod \"dns-default-npvts\" (UID: \"fb4df3ac-df8b-4c9c-b7aa-dd9cd455a847\") " pod="openshift-dns/dns-default-npvts" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.242683 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zjls2\" (UniqueName: \"kubernetes.io/projected/63006c82-767f-4514-9d7e-5afd9bfe6e96-kube-api-access-zjls2\") pod \"machine-config-server-vkr88\" (UID: \"63006c82-767f-4514-9d7e-5afd9bfe6e96\") " pod="openshift-machine-config-operator/machine-config-server-vkr88" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.242718 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4gx5\" (UniqueName: \"kubernetes.io/projected/7759829d-d50c-4dd7-8627-040ebf8f0e40-kube-api-access-x4gx5\") pod \"ingress-canary-2k6jt\" (UID: \"7759829d-d50c-4dd7-8627-040ebf8f0e40\") " pod="openshift-ingress-canary/ingress-canary-2k6jt" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.242741 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/a1a13dfd-7f8c-4a52-9ace-ffb14d3d91f2-registration-dir\") pod \"csi-hostpathplugin-trp9l\" (UID: \"a1a13dfd-7f8c-4a52-9ace-ffb14d3d91f2\") " 
pod="hostpath-provisioner/csi-hostpathplugin-trp9l" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.242769 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7759829d-d50c-4dd7-8627-040ebf8f0e40-cert\") pod \"ingress-canary-2k6jt\" (UID: \"7759829d-d50c-4dd7-8627-040ebf8f0e40\") " pod="openshift-ingress-canary/ingress-canary-2k6jt" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.243233 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/a1a13dfd-7f8c-4a52-9ace-ffb14d3d91f2-plugins-dir\") pod \"csi-hostpathplugin-trp9l\" (UID: \"a1a13dfd-7f8c-4a52-9ace-ffb14d3d91f2\") " pod="hostpath-provisioner/csi-hostpathplugin-trp9l" Mar 16 00:10:35 crc kubenswrapper[4816]: E0316 00:10:35.243348 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:35.743323139 +0000 UTC m=+228.839623092 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.245220 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/a1a13dfd-7f8c-4a52-9ace-ffb14d3d91f2-socket-dir\") pod \"csi-hostpathplugin-trp9l\" (UID: \"a1a13dfd-7f8c-4a52-9ace-ffb14d3d91f2\") " pod="hostpath-provisioner/csi-hostpathplugin-trp9l" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.246086 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fb4df3ac-df8b-4c9c-b7aa-dd9cd455a847-config-volume\") pod \"dns-default-npvts\" (UID: \"fb4df3ac-df8b-4c9c-b7aa-dd9cd455a847\") " pod="openshift-dns/dns-default-npvts" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.247861 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/a1a13dfd-7f8c-4a52-9ace-ffb14d3d91f2-mountpoint-dir\") pod \"csi-hostpathplugin-trp9l\" (UID: \"a1a13dfd-7f8c-4a52-9ace-ffb14d3d91f2\") " pod="hostpath-provisioner/csi-hostpathplugin-trp9l" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.247951 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/a1a13dfd-7f8c-4a52-9ace-ffb14d3d91f2-csi-data-dir\") pod \"csi-hostpathplugin-trp9l\" (UID: \"a1a13dfd-7f8c-4a52-9ace-ffb14d3d91f2\") " pod="hostpath-provisioner/csi-hostpathplugin-trp9l" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 
00:10:35.248050 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/a1a13dfd-7f8c-4a52-9ace-ffb14d3d91f2-registration-dir\") pod \"csi-hostpathplugin-trp9l\" (UID: \"a1a13dfd-7f8c-4a52-9ace-ffb14d3d91f2\") " pod="hostpath-provisioner/csi-hostpathplugin-trp9l" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.261199 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/63006c82-767f-4514-9d7e-5afd9bfe6e96-node-bootstrap-token\") pod \"machine-config-server-vkr88\" (UID: \"63006c82-767f-4514-9d7e-5afd9bfe6e96\") " pod="openshift-machine-config-operator/machine-config-server-vkr88" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.265936 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-mplx7"] Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.265978 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fk9l7"] Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.268782 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/63006c82-767f-4514-9d7e-5afd9bfe6e96-certs\") pod \"machine-config-server-vkr88\" (UID: \"63006c82-767f-4514-9d7e-5afd9bfe6e96\") " pod="openshift-machine-config-operator/machine-config-server-vkr88" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.273557 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7759829d-d50c-4dd7-8627-040ebf8f0e40-cert\") pod \"ingress-canary-2k6jt\" (UID: \"7759829d-d50c-4dd7-8627-040ebf8f0e40\") " pod="openshift-ingress-canary/ingress-canary-2k6jt" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.275882 4816 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fb4df3ac-df8b-4c9c-b7aa-dd9cd455a847-metrics-tls\") pod \"dns-default-npvts\" (UID: \"fb4df3ac-df8b-4c9c-b7aa-dd9cd455a847\") " pod="openshift-dns/dns-default-npvts" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.297020 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kgzpq\" (UniqueName: \"kubernetes.io/projected/fb4df3ac-df8b-4c9c-b7aa-dd9cd455a847-kube-api-access-kgzpq\") pod \"dns-default-npvts\" (UID: \"fb4df3ac-df8b-4c9c-b7aa-dd9cd455a847\") " pod="openshift-dns/dns-default-npvts" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.315728 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjls2\" (UniqueName: \"kubernetes.io/projected/63006c82-767f-4514-9d7e-5afd9bfe6e96-kube-api-access-zjls2\") pod \"machine-config-server-vkr88\" (UID: \"63006c82-767f-4514-9d7e-5afd9bfe6e96\") " pod="openshift-machine-config-operator/machine-config-server-vkr88" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.320100 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-sshl5" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.344225 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4gx5\" (UniqueName: \"kubernetes.io/projected/7759829d-d50c-4dd7-8627-040ebf8f0e40-kube-api-access-x4gx5\") pod \"ingress-canary-2k6jt\" (UID: \"7759829d-d50c-4dd7-8627-040ebf8f0e40\") " pod="openshift-ingress-canary/ingress-canary-2k6jt" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.358078 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:10:35 crc kubenswrapper[4816]: E0316 00:10:35.358588 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:35.858575805 +0000 UTC m=+228.954875758 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ckvwn" (UID: "b155133b-d494-44bc-aa5d-23efc7cbd7a6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.365397 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z5cmq\" (UniqueName: \"kubernetes.io/projected/a1a13dfd-7f8c-4a52-9ace-ffb14d3d91f2-kube-api-access-z5cmq\") pod \"csi-hostpathplugin-trp9l\" (UID: \"a1a13dfd-7f8c-4a52-9ace-ffb14d3d91f2\") " pod="hostpath-provisioner/csi-hostpathplugin-trp9l" Mar 16 00:10:35 crc kubenswrapper[4816]: W0316 00:10:35.435990 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod069c0b04_3302_488c_84fc_eeccac5fae9b.slice/crio-87bda69ca4cbf86419070ae0f95c798316c9febbcad86ed3ec9d714696e0e739 WatchSource:0}: Error finding container 87bda69ca4cbf86419070ae0f95c798316c9febbcad86ed3ec9d714696e0e739: Status 404 returned error can't find the container with id 87bda69ca4cbf86419070ae0f95c798316c9febbcad86ed3ec9d714696e0e739 Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.458722 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:35 crc kubenswrapper[4816]: E0316 00:10:35.458854 4816 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:35.958832663 +0000 UTC m=+229.055132616 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.459516 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:10:35 crc kubenswrapper[4816]: E0316 00:10:35.459826 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:35.959817439 +0000 UTC m=+229.056117392 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ckvwn" (UID: "b155133b-d494-44bc-aa5d-23efc7cbd7a6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.533119 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29560330-44pts"] Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.554682 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-2k6jt" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.561623 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-trp9l" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.562144 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:35 crc kubenswrapper[4816]: E0316 00:10:35.562386 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:36.062367109 +0000 UTC m=+229.158667062 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.562455 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:10:35 crc kubenswrapper[4816]: E0316 00:10:35.562767 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:36.06276079 +0000 UTC m=+229.159060743 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ckvwn" (UID: "b155133b-d494-44bc-aa5d-23efc7cbd7a6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.580983 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-npvts" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.587773 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-vkr88" Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.668304 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:35 crc kubenswrapper[4816]: E0316 00:10:35.668712 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:36.168694463 +0000 UTC m=+229.264994416 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.693985 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-pruner-29560320-s9q72"] Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.702306 4816 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.734442 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-tv2n7"] Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.771088 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:10:35 crc kubenswrapper[4816]: E0316 00:10:35.771517 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:36.271502659 +0000 UTC m=+229.367802602 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ckvwn" (UID: "b155133b-d494-44bc-aa5d-23efc7cbd7a6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.793852 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-n7fkv" event={"ID":"c1674a73-a65c-4a8d-9dc5-af576a7af7d4","Type":"ContainerStarted","Data":"7c6c500cfb28dc83d59d87853219a73cb22b2472353d3dcec62ec2379a608552"} Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.800567 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fk9l7" event={"ID":"069c0b04-3302-488c-84fc-eeccac5fae9b","Type":"ContainerStarted","Data":"87bda69ca4cbf86419070ae0f95c798316c9febbcad86ed3ec9d714696e0e739"} Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.813162 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-d9j8j" event={"ID":"1d5466ab-a589-4f7e-ae89-2f494b10f6b1","Type":"ContainerStarted","Data":"8bd92ab2e8746013ff96fbb3362f4a912a98fe884156f1b95b8704505ab4fe1a"} Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.816484 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-d4vrm"] Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.818010 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-t8c6x" 
event={"ID":"93dadffe-0353-4301-bb97-31b034d3dc64","Type":"ContainerStarted","Data":"974b3e2b70f0e27a9b647bc8e391de77e5ca6e17e53d98fb67c7dcd79bccb304"} Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.826139 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-nnqsw" event={"ID":"32cac2d0-56f0-4ba8-86dc-9b57c0fcc11c","Type":"ContainerStarted","Data":"a0d10d1111e7e2e77d52b5e27988eadcbe0b5acc6c5de4dfdc0a8578af7ebd49"} Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.827624 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-6r96v" event={"ID":"044562bd-df74-47fa-bc8d-1c652233e9c5","Type":"ContainerStarted","Data":"8603120fd31cddbb1af5cb78a46694e27d3663732e21a55abf708208a435fb18"} Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.834246 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rqxss" event={"ID":"005900fa-b395-4c1c-8e62-8e975bd0393c","Type":"ContainerStarted","Data":"0529b06f25e6c243657457dfefc2530ec2a164ac5044d8a511ecbf34234991c8"} Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.848708 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-5rr7c"] Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.850296 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-9xv4p" event={"ID":"1306b657-0022-435d-bb72-793f1c1a106b","Type":"ContainerStarted","Data":"064af4bec576ab72311cb9c881471458180983ba55703837df4cc30019c61214"} Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.852387 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tgbjg" 
event={"ID":"34f93b2b-cc36-4965-992c-825bf2595e1e","Type":"ContainerStarted","Data":"e4420a97b4bb6be4d17b075ce1b2cb37e8413a0733d1148eec85de090c917bf4"} Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.853462 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-mplx7" event={"ID":"7442ef1b-27ea-4166-8457-5332c4c8f363","Type":"ContainerStarted","Data":"73c6daa7fcb97ac560caea2fc845a1867ae5d14edd93f8da77d854675d386c28"} Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.864617 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560330-44pts" event={"ID":"55e76e8f-7d69-4f55-81f8-45c9c612876b","Type":"ContainerStarted","Data":"14379482594ebf801c25583d0aab03c78f3555265f22f25f8cbeb498177ecef2"} Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.866830 4816 generic.go:334] "Generic (PLEG): container finished" podID="dced2102-9fd0-4300-9e0a-35d915f1caad" containerID="fe139286db47fd751017f6a262d773b9fb23cd2d2ec4d8ec9a6e456cf554930a" exitCode=0 Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.866974 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zrq8d" event={"ID":"dced2102-9fd0-4300-9e0a-35d915f1caad","Type":"ContainerDied","Data":"fe139286db47fd751017f6a262d773b9fb23cd2d2ec4d8ec9a6e456cf554930a"} Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.868639 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-v8zx4" event={"ID":"b98dcc1e-4c4b-47eb-9ddf-59a138f94247","Type":"ContainerStarted","Data":"eeb77bec78a64a33e16b0e2cefa3b8ccabc03a84fa07e1289532500e2736e77c"} Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.873053 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-s8mgz" 
event={"ID":"5e41d768-3ed4-4760-a0d5-4308d7b13379","Type":"ContainerStarted","Data":"5da8a9034d9756144255cb527badb2e6f547c6ea514e0a549acc6e784382f799"} Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.873509 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:35 crc kubenswrapper[4816]: E0316 00:10:35.873846 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:36.373832323 +0000 UTC m=+229.470132276 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.884831 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-q9xc9"] Mar 16 00:10:35 crc kubenswrapper[4816]: I0316 00:10:35.976304 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:10:35 crc kubenswrapper[4816]: E0316 00:10:35.977411 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:36.477396071 +0000 UTC m=+229.573696024 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ckvwn" (UID: "b155133b-d494-44bc-aa5d-23efc7cbd7a6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:36 crc kubenswrapper[4816]: W0316 00:10:36.037576 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4f90d894_17c6_4800_a438_737fe8619e01.slice/crio-1b400492d5ff6227c838ac428fcdaf391667d35c7a51faf19494d2104b095289 WatchSource:0}: Error finding container 1b400492d5ff6227c838ac428fcdaf391667d35c7a51faf19494d2104b095289: Status 404 returned error can't find the container with id 1b400492d5ff6227c838ac428fcdaf391667d35c7a51faf19494d2104b095289 Mar 16 00:10:36 crc kubenswrapper[4816]: I0316 00:10:36.077276 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:36 crc kubenswrapper[4816]: E0316 00:10:36.077771 4816 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:36.577751941 +0000 UTC m=+229.674051894 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:36 crc kubenswrapper[4816]: I0316 00:10:36.179334 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:10:36 crc kubenswrapper[4816]: E0316 00:10:36.179641 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:36.679625342 +0000 UTC m=+229.775925295 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ckvwn" (UID: "b155133b-d494-44bc-aa5d-23efc7cbd7a6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:36 crc kubenswrapper[4816]: I0316 00:10:36.283693 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:36 crc kubenswrapper[4816]: E0316 00:10:36.284327 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:36.784313071 +0000 UTC m=+229.880613024 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:36 crc kubenswrapper[4816]: I0316 00:10:36.290890 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-4t6gm"] Mar 16 00:10:36 crc kubenswrapper[4816]: W0316 00:10:36.317026 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4bdfe91_48ef_4a44_99ba_d7ab90df9ec0.slice/crio-e7512c0843d9774277717bee67fedbfc19a4fbf662ac7ce19fc4a721573a25b6 WatchSource:0}: Error finding container e7512c0843d9774277717bee67fedbfc19a4fbf662ac7ce19fc4a721573a25b6: Status 404 returned error can't find the container with id e7512c0843d9774277717bee67fedbfc19a4fbf662ac7ce19fc4a721573a25b6 Mar 16 00:10:36 crc kubenswrapper[4816]: I0316 00:10:36.354837 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zn6w7"] Mar 16 00:10:36 crc kubenswrapper[4816]: I0316 00:10:36.358926 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-sh4ps"] Mar 16 00:10:36 crc kubenswrapper[4816]: I0316 00:10:36.372862 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-l648b"] Mar 16 00:10:36 crc kubenswrapper[4816]: I0316 00:10:36.375371 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-gtwmf"] Mar 16 00:10:36 crc 
kubenswrapper[4816]: I0316 00:10:36.385404 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:10:36 crc kubenswrapper[4816]: E0316 00:10:36.385776 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:36.88576155 +0000 UTC m=+229.982061503 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ckvwn" (UID: "b155133b-d494-44bc-aa5d-23efc7cbd7a6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:36 crc kubenswrapper[4816]: I0316 00:10:36.386288 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-vvdz2"] Mar 16 00:10:36 crc kubenswrapper[4816]: I0316 00:10:36.486796 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:36 crc kubenswrapper[4816]: E0316 00:10:36.486976 4816 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:36.986950343 +0000 UTC m=+230.083250296 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:36 crc kubenswrapper[4816]: I0316 00:10:36.487523 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:10:36 crc kubenswrapper[4816]: E0316 00:10:36.487886 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:36.987870248 +0000 UTC m=+230.084170201 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ckvwn" (UID: "b155133b-d494-44bc-aa5d-23efc7cbd7a6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:36 crc kubenswrapper[4816]: W0316 00:10:36.515181 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod064b42ee_720b_456c_8ffe_a247f827befc.slice/crio-435f4c94388020c089f5ba1a78632c86095286e9141f52726e93fc0a6363522c WatchSource:0}: Error finding container 435f4c94388020c089f5ba1a78632c86095286e9141f52726e93fc0a6363522c: Status 404 returned error can't find the container with id 435f4c94388020c089f5ba1a78632c86095286e9141f52726e93fc0a6363522c Mar 16 00:10:36 crc kubenswrapper[4816]: W0316 00:10:36.516178 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod29bcfe72_6ef1_4087_9feb_787fdba3d2d7.slice/crio-3c7d92de283b16898a891bb4b37b3cd54bf360ea0cbf78310da58d80dac3aa50 WatchSource:0}: Error finding container 3c7d92de283b16898a891bb4b37b3cd54bf360ea0cbf78310da58d80dac3aa50: Status 404 returned error can't find the container with id 3c7d92de283b16898a891bb4b37b3cd54bf360ea0cbf78310da58d80dac3aa50 Mar 16 00:10:36 crc kubenswrapper[4816]: W0316 00:10:36.537035 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod79ec9746_96c0_4fcd_b367_a42b6950145a.slice/crio-d3c8b5ab4cfd90d2004818a74572fcd44139f28307a477559adc275564959dd6 WatchSource:0}: Error finding container d3c8b5ab4cfd90d2004818a74572fcd44139f28307a477559adc275564959dd6: Status 404 returned error can't find the container 
with id d3c8b5ab4cfd90d2004818a74572fcd44139f28307a477559adc275564959dd6 Mar 16 00:10:36 crc kubenswrapper[4816]: I0316 00:10:36.588975 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:36 crc kubenswrapper[4816]: E0316 00:10:36.589051 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:37.08902991 +0000 UTC m=+230.185329863 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:36 crc kubenswrapper[4816]: I0316 00:10:36.590679 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:10:36 crc kubenswrapper[4816]: E0316 00:10:36.592302 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: 
nodeName:}" failed. No retries permitted until 2026-03-16 00:10:37.092268929 +0000 UTC m=+230.188568882 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ckvwn" (UID: "b155133b-d494-44bc-aa5d-23efc7cbd7a6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:36 crc kubenswrapper[4816]: I0316 00:10:36.695540 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:36 crc kubenswrapper[4816]: E0316 00:10:36.698233 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:37.198204661 +0000 UTC m=+230.294504614 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:36 crc kubenswrapper[4816]: I0316 00:10:36.698323 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29560320-4hk5d"] Mar 16 00:10:36 crc kubenswrapper[4816]: I0316 00:10:36.701943 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:10:36 crc kubenswrapper[4816]: E0316 00:10:36.703097 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:37.203080994 +0000 UTC m=+230.299380947 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ckvwn" (UID: "b155133b-d494-44bc-aa5d-23efc7cbd7a6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:36 crc kubenswrapper[4816]: I0316 00:10:36.733644 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8tt9t"] Mar 16 00:10:36 crc kubenswrapper[4816]: I0316 00:10:36.740895 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ll5r8"] Mar 16 00:10:36 crc kubenswrapper[4816]: I0316 00:10:36.747315 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-6nkm6"] Mar 16 00:10:36 crc kubenswrapper[4816]: I0316 00:10:36.750697 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-mwhpz"] Mar 16 00:10:36 crc kubenswrapper[4816]: I0316 00:10:36.756105 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-nsxl4"] Mar 16 00:10:36 crc kubenswrapper[4816]: I0316 00:10:36.804265 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:36 crc kubenswrapper[4816]: E0316 00:10:36.804583 4816 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:37.304534154 +0000 UTC m=+230.400834107 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:36 crc kubenswrapper[4816]: I0316 00:10:36.828476 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jm8db"] Mar 16 00:10:36 crc kubenswrapper[4816]: I0316 00:10:36.834288 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-2l7nk" podStartSLOduration=178.834265196 podStartE2EDuration="2m58.834265196s" podCreationTimestamp="2026-03-16 00:07:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:36.831738507 +0000 UTC m=+229.928038460" watchObservedRunningTime="2026-03-16 00:10:36.834265196 +0000 UTC m=+229.930565149" Mar 16 00:10:36 crc kubenswrapper[4816]: I0316 00:10:36.890345 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4t6gm" event={"ID":"f4bdfe91-48ef-4a44-99ba-d7ab90df9ec0","Type":"ContainerStarted","Data":"e7512c0843d9774277717bee67fedbfc19a4fbf662ac7ce19fc4a721573a25b6"} Mar 16 00:10:36 crc kubenswrapper[4816]: I0316 00:10:36.895996 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tgbjg" event={"ID":"34f93b2b-cc36-4965-992c-825bf2595e1e","Type":"ContainerStarted","Data":"0102ebae46c1fc8e99ff0cb4e50c57fe46507b471478556f3430169c1de5acb0"} Mar 16 00:10:36 crc kubenswrapper[4816]: I0316 00:10:36.897801 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-s8mgz" event={"ID":"5e41d768-3ed4-4760-a0d5-4308d7b13379","Type":"ContainerStarted","Data":"ccf54344c9cf242f18daa95790d42c562fbe2956ec54e050d405db668a935bf5"} Mar 16 00:10:36 crc kubenswrapper[4816]: I0316 00:10:36.902862 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-n7fkv" event={"ID":"c1674a73-a65c-4a8d-9dc5-af576a7af7d4","Type":"ContainerStarted","Data":"2034ace62ae33c1ff5aa46586892fcaa5167fe1712a3333f7b2e270628a77021"} Mar 16 00:10:36 crc kubenswrapper[4816]: I0316 00:10:36.906099 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-sshl5"] Mar 16 00:10:36 crc kubenswrapper[4816]: I0316 00:10:36.906194 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:10:36 crc kubenswrapper[4816]: E0316 00:10:36.906525 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:37.406511498 +0000 UTC m=+230.502811461 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ckvwn" (UID: "b155133b-d494-44bc-aa5d-23efc7cbd7a6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:36 crc kubenswrapper[4816]: I0316 00:10:36.908947 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-fnmb9" event={"ID":"171b00f7-f7cf-41b3-bffd-11ceeb9f2182","Type":"ContainerStarted","Data":"74ca3ab68b5db7b7155d0d401ca028d9f990f240b0723da3ddb90c76801efeca"} Mar 16 00:10:36 crc kubenswrapper[4816]: I0316 00:10:36.910066 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-nnqsw" event={"ID":"32cac2d0-56f0-4ba8-86dc-9b57c0fcc11c","Type":"ContainerStarted","Data":"b3a7ebd80e41e65628b937f27a9a6d0026e0665dfdb1a3dff17f4765a374b5cb"} Mar 16 00:10:36 crc kubenswrapper[4816]: I0316 00:10:36.914839 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29560320-4hk5d" event={"ID":"9397185e-a9e3-4ef4-b0be-d9dc9208adff","Type":"ContainerStarted","Data":"b19b5574ead1cf818c519a7ffb8ef773b81e380296fd94d88cb6d44a3be77066"} Mar 16 00:10:36 crc kubenswrapper[4816]: I0316 00:10:36.916576 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-l648b" event={"ID":"ef3a7303-57a8-461f-86c1-fd3f7882e93b","Type":"ContainerStarted","Data":"3d4fc8466c3535a388e50e076cc43357b3c1637c0fa69064b15bbb7d8d979495"} Mar 16 00:10:36 crc kubenswrapper[4816]: I0316 00:10:36.922204 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-t8c6x" 
event={"ID":"93dadffe-0353-4301-bb97-31b034d3dc64","Type":"ContainerStarted","Data":"165ac52589fbf987de83ac85b0f5daf2f38695714d76c0365a37f99757d92693"} Mar 16 00:10:36 crc kubenswrapper[4816]: I0316 00:10:36.925918 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-tv2n7" event={"ID":"59c840a8-f288-44ed-83d3-34d47041c6c6","Type":"ContainerStarted","Data":"c46a7076608f889c8e30b77b33715ed49c92e64799e40fe88b9f99f6e980f6a5"} Mar 16 00:10:36 crc kubenswrapper[4816]: I0316 00:10:36.925966 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-tv2n7" event={"ID":"59c840a8-f288-44ed-83d3-34d47041c6c6","Type":"ContainerStarted","Data":"360f090f6a27a9d9ebb782602e54104c845a3d5e91127b115ef7d468e384ebfe"} Mar 16 00:10:36 crc kubenswrapper[4816]: I0316 00:10:36.927043 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-tv2n7" Mar 16 00:10:36 crc kubenswrapper[4816]: I0316 00:10:36.927935 4816 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-tv2n7 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.25:8443/healthz\": dial tcp 10.217.0.25:8443: connect: connection refused" start-of-body= Mar 16 00:10:36 crc kubenswrapper[4816]: I0316 00:10:36.928175 4816 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-tv2n7" podUID="59c840a8-f288-44ed-83d3-34d47041c6c6" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.25:8443/healthz\": dial tcp 10.217.0.25:8443: connect: connection refused" Mar 16 00:10:36 crc kubenswrapper[4816]: I0316 00:10:36.928217 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-sh4ps" event={"ID":"0a98bca9-38d3-4382-a6d6-8410170f7d81","Type":"ContainerStarted","Data":"459a22c156ab861c35576c1fb3628cbc341c3b5347eb2308c3393d558f2a1ab7"} Mar 16 00:10:36 crc kubenswrapper[4816]: I0316 00:10:36.936290 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-9xv4p" event={"ID":"1306b657-0022-435d-bb72-793f1c1a106b","Type":"ContainerStarted","Data":"b3b4ee5b25b557320498976e779daeaaad18e40f9b7a18ae9011a6748fba579b"} Mar 16 00:10:36 crc kubenswrapper[4816]: I0316 00:10:36.944437 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-6r96v" event={"ID":"044562bd-df74-47fa-bc8d-1c652233e9c5","Type":"ContainerStarted","Data":"feb3e115aaa958caf7f3f1e53cfdcc9b39f697220b3b5b82172c36b393279bde"} Mar 16 00:10:36 crc kubenswrapper[4816]: I0316 00:10:36.945787 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-2k6jt"] Mar 16 00:10:36 crc kubenswrapper[4816]: I0316 00:10:36.947717 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-gvk75" event={"ID":"681ca8e4-f909-4e8b-9f35-5ab8ca382e44","Type":"ContainerStarted","Data":"6af71f7281740ba22cc030b932f6be0867123264435c44d5f5ba540286f918d6"} Mar 16 00:10:36 crc kubenswrapper[4816]: I0316 00:10:36.963862 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fk9l7" event={"ID":"069c0b04-3302-488c-84fc-eeccac5fae9b","Type":"ContainerStarted","Data":"9f73049d4c6407e7f4bf7d9a09acf185cda5e22663da389d70890266ca82af24"} Mar 16 00:10:36 crc kubenswrapper[4816]: I0316 00:10:36.972603 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-vvdz2" 
event={"ID":"79ec9746-96c0-4fcd-b367-a42b6950145a","Type":"ContainerStarted","Data":"d3c8b5ab4cfd90d2004818a74572fcd44139f28307a477559adc275564959dd6"} Mar 16 00:10:36 crc kubenswrapper[4816]: I0316 00:10:36.980880 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zn6w7" event={"ID":"064b42ee-720b-456c-8ffe-a247f827befc","Type":"ContainerStarted","Data":"435f4c94388020c089f5ba1a78632c86095286e9141f52726e93fc0a6363522c"} Mar 16 00:10:36 crc kubenswrapper[4816]: I0316 00:10:36.981808 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-5rr7c" event={"ID":"0ec3cdc0-f024-43cf-b520-7d2437e0f8df","Type":"ContainerStarted","Data":"38ef432004d9ccef99538782b144f5189443b51f7377a192547af787185ab274"} Mar 16 00:10:36 crc kubenswrapper[4816]: I0316 00:10:36.982469 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-d4vrm" event={"ID":"4f90d894-17c6-4800-a438-737fe8619e01","Type":"ContainerStarted","Data":"1b400492d5ff6227c838ac428fcdaf391667d35c7a51faf19494d2104b095289"} Mar 16 00:10:36 crc kubenswrapper[4816]: I0316 00:10:36.983878 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-8226q"] Mar 16 00:10:36 crc kubenswrapper[4816]: I0316 00:10:36.986239 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-vkr88" event={"ID":"63006c82-767f-4514-9d7e-5afd9bfe6e96","Type":"ContainerStarted","Data":"defc7bad5c4bbc8a32133730e52ae3ffb20ab00ebd6954c4c5771830720b2d0c"} Mar 16 00:10:36 crc kubenswrapper[4816]: I0316 00:10:36.986280 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-vkr88" 
event={"ID":"63006c82-767f-4514-9d7e-5afd9bfe6e96","Type":"ContainerStarted","Data":"77a12cf7d9180afb497655c7adf9f2eba68426ea31f62166c80397d342b69cff"} Mar 16 00:10:36 crc kubenswrapper[4816]: I0316 00:10:36.990727 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29560320-s9q72" event={"ID":"9fc59286-0388-4519-afc7-f2c8cf80ab40","Type":"ContainerStarted","Data":"a30805e487fac9e751dab1510445d1b512d8b7784f8e73df1f67f72887178e24"} Mar 16 00:10:36 crc kubenswrapper[4816]: I0316 00:10:36.990768 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29560320-s9q72" event={"ID":"9fc59286-0388-4519-afc7-f2c8cf80ab40","Type":"ContainerStarted","Data":"469ef439f1bc4e49165115c6fecd0f6feec675c1f680294bca4301ee3520daee"} Mar 16 00:10:36 crc kubenswrapper[4816]: I0316 00:10:36.993952 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-fwkzt"] Mar 16 00:10:36 crc kubenswrapper[4816]: I0316 00:10:36.994625 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rqxss" event={"ID":"005900fa-b395-4c1c-8e62-8e975bd0393c","Type":"ContainerStarted","Data":"aa45c97bed9092558ce91ff7b080b67a6b2b1b3a899958cb8b70ee721ff99937"} Mar 16 00:10:36 crc kubenswrapper[4816]: I0316 00:10:36.996279 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-q9xc9" event={"ID":"4cc341f9-55c7-4bce-a0e3-24df68ca7f0e","Type":"ContainerStarted","Data":"28227b88f3697e660f1b439e8a912712575523aba2448c2ce551decede39b872"} Mar 16 00:10:37 crc kubenswrapper[4816]: I0316 00:10:37.001898 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-gtwmf" 
event={"ID":"29bcfe72-6ef1-4087-9feb-787fdba3d2d7","Type":"ContainerStarted","Data":"3c7d92de283b16898a891bb4b37b3cd54bf360ea0cbf78310da58d80dac3aa50"} Mar 16 00:10:37 crc kubenswrapper[4816]: I0316 00:10:37.003048 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-d9j8j" event={"ID":"1d5466ab-a589-4f7e-ae89-2f494b10f6b1","Type":"ContainerStarted","Data":"e90fdfac87f05e45b64d63ce5cb4d5902fbd18d9c1d580577069351527db0c29"} Mar 16 00:10:37 crc kubenswrapper[4816]: I0316 00:10:37.003400 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-d9j8j" Mar 16 00:10:37 crc kubenswrapper[4816]: I0316 00:10:37.005467 4816 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-d9j8j container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.11:8443/healthz\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body= Mar 16 00:10:37 crc kubenswrapper[4816]: I0316 00:10:37.005492 4816 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-d9j8j" podUID="1d5466ab-a589-4f7e-ae89-2f494b10f6b1" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.11:8443/healthz\": dial tcp 10.217.0.11:8443: connect: connection refused" Mar 16 00:10:37 crc kubenswrapper[4816]: I0316 00:10:37.005532 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-s8mgz" podStartSLOduration=179.005522821 podStartE2EDuration="2m59.005522821s" podCreationTimestamp="2026-03-16 00:07:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 
00:10:37.003071454 +0000 UTC m=+230.099371407" watchObservedRunningTime="2026-03-16 00:10:37.005522821 +0000 UTC m=+230.101822774" Mar 16 00:10:37 crc kubenswrapper[4816]: I0316 00:10:37.006345 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-v8zx4" event={"ID":"b98dcc1e-4c4b-47eb-9ddf-59a138f94247","Type":"ContainerStarted","Data":"3649dcfc95a4cd8b92968a789c1a118f74cc3c2308abfaa832728996a26a704c"} Mar 16 00:10:37 crc kubenswrapper[4816]: I0316 00:10:37.006702 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:37 crc kubenswrapper[4816]: E0316 00:10:37.007170 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:37.507114504 +0000 UTC m=+230.603414527 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:37 crc kubenswrapper[4816]: I0316 00:10:37.008605 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:10:37 crc kubenswrapper[4816]: E0316 00:10:37.012719 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:37.512703297 +0000 UTC m=+230.609003360 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ckvwn" (UID: "b155133b-d494-44bc-aa5d-23efc7cbd7a6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:37 crc kubenswrapper[4816]: I0316 00:10:37.019426 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-npvts"] Mar 16 00:10:37 crc kubenswrapper[4816]: I0316 00:10:37.046578 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-9xv4p" podStartSLOduration=179.046543651 podStartE2EDuration="2m59.046543651s" podCreationTimestamp="2026-03-16 00:07:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:37.045537853 +0000 UTC m=+230.141837806" watchObservedRunningTime="2026-03-16 00:10:37.046543651 +0000 UTC m=+230.142843604" Mar 16 00:10:37 crc kubenswrapper[4816]: W0316 00:10:37.070066 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda1bc4b9a_e741_4f63_81e8_fdce3da0a5ea.slice/crio-2a2d73448d9f3e9c877e90218a56aa472e79752e5da512ecfe9f6dffb3aad02f WatchSource:0}: Error finding container 2a2d73448d9f3e9c877e90218a56aa472e79752e5da512ecfe9f6dffb3aad02f: Status 404 returned error can't find the container with id 2a2d73448d9f3e9c877e90218a56aa472e79752e5da512ecfe9f6dffb3aad02f Mar 16 00:10:37 crc kubenswrapper[4816]: I0316 00:10:37.108130 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-trp9l"] Mar 16 00:10:37 crc kubenswrapper[4816]: I0316 00:10:37.108687 4816 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-tv2n7" podStartSLOduration=179.108669637 podStartE2EDuration="2m59.108669637s" podCreationTimestamp="2026-03-16 00:07:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:37.097665317 +0000 UTC m=+230.193965270" watchObservedRunningTime="2026-03-16 00:10:37.108669637 +0000 UTC m=+230.204969590" Mar 16 00:10:37 crc kubenswrapper[4816]: I0316 00:10:37.110101 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:37 crc kubenswrapper[4816]: E0316 00:10:37.110170 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:37.610154618 +0000 UTC m=+230.706454571 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:37 crc kubenswrapper[4816]: I0316 00:10:37.117188 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:10:37 crc kubenswrapper[4816]: E0316 00:10:37.118382 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:37.618363132 +0000 UTC m=+230.714663085 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ckvwn" (UID: "b155133b-d494-44bc-aa5d-23efc7cbd7a6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:37 crc kubenswrapper[4816]: I0316 00:10:37.147067 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-6r96v" podStartSLOduration=180.147045305 podStartE2EDuration="3m0.147045305s" podCreationTimestamp="2026-03-16 00:07:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:37.122259218 +0000 UTC m=+230.218559171" watchObservedRunningTime="2026-03-16 00:10:37.147045305 +0000 UTC m=+230.243345258" Mar 16 00:10:37 crc kubenswrapper[4816]: I0316 00:10:37.152987 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-t8c6x" podStartSLOduration=179.152970267 podStartE2EDuration="2m59.152970267s" podCreationTimestamp="2026-03-16 00:07:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:37.152530045 +0000 UTC m=+230.248829998" watchObservedRunningTime="2026-03-16 00:10:37.152970267 +0000 UTC m=+230.249270220" Mar 16 00:10:37 crc kubenswrapper[4816]: I0316 00:10:37.202312 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-n7fkv" podStartSLOduration=180.202291603 podStartE2EDuration="3m0.202291603s" podCreationTimestamp="2026-03-16 00:07:37 
+0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:37.189851194 +0000 UTC m=+230.286151157" watchObservedRunningTime="2026-03-16 00:10:37.202291603 +0000 UTC m=+230.298591556" Mar 16 00:10:37 crc kubenswrapper[4816]: I0316 00:10:37.218915 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:37 crc kubenswrapper[4816]: E0316 00:10:37.219242 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:37.719228806 +0000 UTC m=+230.815528759 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:37 crc kubenswrapper[4816]: I0316 00:10:37.240273 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-gvk75" podStartSLOduration=179.24025488 podStartE2EDuration="2m59.24025488s" podCreationTimestamp="2026-03-16 00:07:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:37.238031559 +0000 UTC m=+230.334331502" watchObservedRunningTime="2026-03-16 00:10:37.24025488 +0000 UTC m=+230.336554833" Mar 16 00:10:37 crc kubenswrapper[4816]: I0316 00:10:37.309383 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-nnqsw" podStartSLOduration=179.309364457 podStartE2EDuration="2m59.309364457s" podCreationTimestamp="2026-03-16 00:07:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:37.274192946 +0000 UTC m=+230.370492909" watchObservedRunningTime="2026-03-16 00:10:37.309364457 +0000 UTC m=+230.405664410" Mar 16 00:10:37 crc kubenswrapper[4816]: I0316 00:10:37.310726 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-d9j8j" podStartSLOduration=179.310716424 podStartE2EDuration="2m59.310716424s" podCreationTimestamp="2026-03-16 00:07:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 
UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:37.309067739 +0000 UTC m=+230.405367712" watchObservedRunningTime="2026-03-16 00:10:37.310716424 +0000 UTC m=+230.407016377" Mar 16 00:10:37 crc kubenswrapper[4816]: I0316 00:10:37.320666 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:10:37 crc kubenswrapper[4816]: E0316 00:10:37.321007 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:37.820992744 +0000 UTC m=+230.917292697 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ckvwn" (UID: "b155133b-d494-44bc-aa5d-23efc7cbd7a6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:37 crc kubenswrapper[4816]: I0316 00:10:37.345111 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-vkr88" podStartSLOduration=5.345094262 podStartE2EDuration="5.345094262s" podCreationTimestamp="2026-03-16 00:10:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:37.344569428 +0000 UTC m=+230.440869381" watchObservedRunningTime="2026-03-16 00:10:37.345094262 +0000 UTC m=+230.441394215" Mar 16 00:10:37 crc kubenswrapper[4816]: I0316 00:10:37.387701 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rqxss" podStartSLOduration=180.387686165 podStartE2EDuration="3m0.387686165s" podCreationTimestamp="2026-03-16 00:07:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:37.386138453 +0000 UTC m=+230.482438436" watchObservedRunningTime="2026-03-16 00:10:37.387686165 +0000 UTC m=+230.483986138" Mar 16 00:10:37 crc kubenswrapper[4816]: I0316 00:10:37.421977 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:37 crc kubenswrapper[4816]: E0316 00:10:37.422284 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:37.922270279 +0000 UTC m=+231.018570232 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:37 crc kubenswrapper[4816]: I0316 00:10:37.430495 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fk9l7" podStartSLOduration=179.430477053 podStartE2EDuration="2m59.430477053s" podCreationTimestamp="2026-03-16 00:07:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:37.42960567 +0000 UTC m=+230.525905633" watchObservedRunningTime="2026-03-16 00:10:37.430477053 +0000 UTC m=+230.526777006" Mar 16 00:10:37 crc kubenswrapper[4816]: I0316 00:10:37.478091 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-pruner-29560320-s9q72" podStartSLOduration=180.478072313 podStartE2EDuration="3m0.478072313s" podCreationTimestamp="2026-03-16 00:07:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-16 00:10:37.476169551 +0000 UTC m=+230.572469504" watchObservedRunningTime="2026-03-16 00:10:37.478072313 +0000 UTC m=+230.574372266" Mar 16 00:10:37 crc kubenswrapper[4816]: I0316 00:10:37.524444 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:10:37 crc kubenswrapper[4816]: E0316 00:10:37.524819 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:38.024799669 +0000 UTC m=+231.121099682 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ckvwn" (UID: "b155133b-d494-44bc-aa5d-23efc7cbd7a6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:37 crc kubenswrapper[4816]: I0316 00:10:37.553025 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-gvk75" Mar 16 00:10:37 crc kubenswrapper[4816]: I0316 00:10:37.553575 4816 patch_prober.go:28] interesting pod/router-default-5444994796-gvk75 container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Mar 16 00:10:37 crc kubenswrapper[4816]: 
I0316 00:10:37.553604 4816 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gvk75" podUID="681ca8e4-f909-4e8b-9f35-5ab8ca382e44" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" Mar 16 00:10:37 crc kubenswrapper[4816]: I0316 00:10:37.628180 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:37 crc kubenswrapper[4816]: E0316 00:10:37.628353 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:38.128327485 +0000 UTC m=+231.224627438 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:37 crc kubenswrapper[4816]: I0316 00:10:37.628498 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:10:37 crc kubenswrapper[4816]: E0316 00:10:37.628945 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:38.128936192 +0000 UTC m=+231.225236145 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ckvwn" (UID: "b155133b-d494-44bc-aa5d-23efc7cbd7a6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:37 crc kubenswrapper[4816]: I0316 00:10:37.729144 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:37 crc kubenswrapper[4816]: E0316 00:10:37.729369 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:38.229342943 +0000 UTC m=+231.325642906 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:37 crc kubenswrapper[4816]: I0316 00:10:37.729818 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:10:37 crc kubenswrapper[4816]: E0316 00:10:37.730135 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:38.230122695 +0000 UTC m=+231.326422648 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ckvwn" (UID: "b155133b-d494-44bc-aa5d-23efc7cbd7a6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:37 crc kubenswrapper[4816]: I0316 00:10:37.830569 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:37 crc kubenswrapper[4816]: E0316 00:10:37.830944 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:38.330924257 +0000 UTC m=+231.427224210 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:37 crc kubenswrapper[4816]: I0316 00:10:37.931620 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:10:37 crc kubenswrapper[4816]: E0316 00:10:37.932105 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:38.432090609 +0000 UTC m=+231.528390562 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ckvwn" (UID: "b155133b-d494-44bc-aa5d-23efc7cbd7a6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:38 crc kubenswrapper[4816]: I0316 00:10:38.013497 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-6nkm6" event={"ID":"db720c64-b1fa-48c9-a4b7-fc42f8ca47fd","Type":"ContainerStarted","Data":"26f97e4c130f331c2d50ce2b937d50024b4efd11592e9eb2859a5e9894578260"} Mar 16 00:10:38 crc kubenswrapper[4816]: I0316 00:10:38.013597 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-6nkm6" event={"ID":"db720c64-b1fa-48c9-a4b7-fc42f8ca47fd","Type":"ContainerStarted","Data":"a62d9dc9b5386cd41a4ae67f3beb9c6b4df9fc260cfe573952486ec5c5bc7a2e"} Mar 16 00:10:38 crc kubenswrapper[4816]: I0316 00:10:38.017055 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-mplx7" event={"ID":"7442ef1b-27ea-4166-8457-5332c4c8f363","Type":"ContainerStarted","Data":"4c1bf1d323eb433075e918aa55b3dfbdbcb746a1bc363fcfca0b609ae1dff650"} Mar 16 00:10:38 crc kubenswrapper[4816]: I0316 00:10:38.027443 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-trp9l" event={"ID":"a1a13dfd-7f8c-4a52-9ace-ffb14d3d91f2","Type":"ContainerStarted","Data":"dd4d838880c6bd7c297db64a0c1cea2342563e729a46d32282ed954ca2fecaad"} Mar 16 00:10:38 crc kubenswrapper[4816]: I0316 00:10:38.032393 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:38 crc kubenswrapper[4816]: E0316 00:10:38.032769 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:38.532751097 +0000 UTC m=+231.629051050 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:38 crc kubenswrapper[4816]: I0316 00:10:38.034177 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-nsxl4" event={"ID":"a1bc4b9a-e741-4f63-81e8-fdce3da0a5ea","Type":"ContainerStarted","Data":"166c68c117b0d047b57a181137d2c0acfbfb9d239261c31c810f6e12108921d9"} Mar 16 00:10:38 crc kubenswrapper[4816]: I0316 00:10:38.034216 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-nsxl4" event={"ID":"a1bc4b9a-e741-4f63-81e8-fdce3da0a5ea","Type":"ContainerStarted","Data":"2a2d73448d9f3e9c877e90218a56aa472e79752e5da512ecfe9f6dffb3aad02f"} Mar 16 00:10:38 crc kubenswrapper[4816]: I0316 00:10:38.038214 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4t6gm" 
event={"ID":"f4bdfe91-48ef-4a44-99ba-d7ab90df9ec0","Type":"ContainerStarted","Data":"aa168efabbf9d392d454d37cbc266812b2313552f614751fa78eb0e89f861f35"} Mar 16 00:10:38 crc kubenswrapper[4816]: I0316 00:10:38.040011 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-d4vrm" event={"ID":"4f90d894-17c6-4800-a438-737fe8619e01","Type":"ContainerStarted","Data":"d9558b98ac0b8301b1e2fd81ab83d4eaebf891ae7f77f266a39b5bc52e74f754"} Mar 16 00:10:38 crc kubenswrapper[4816]: I0316 00:10:38.047693 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-pdm8d" event={"ID":"1f26ea52-1f97-4d4a-98bd-897c5b3b88c5","Type":"ContainerStarted","Data":"eedfb299c62b00dbca7e4f4925bca71acaf7c649798cb665bcff51f84f3f22cb"} Mar 16 00:10:38 crc kubenswrapper[4816]: I0316 00:10:38.053242 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zrq8d" event={"ID":"dced2102-9fd0-4300-9e0a-35d915f1caad","Type":"ContainerStarted","Data":"8a26f5496e8e0c593d07dd4aa2ec276b40aeb18ed7abdb0a1d52fdca818b2d58"} Mar 16 00:10:38 crc kubenswrapper[4816]: I0316 00:10:38.059761 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-gtwmf" event={"ID":"29bcfe72-6ef1-4087-9feb-787fdba3d2d7","Type":"ContainerStarted","Data":"04c28cf56e5a344151bb2ad50a201b5cc3a6cf82f7e0bf250c9d9e5100b6245e"} Mar 16 00:10:38 crc kubenswrapper[4816]: I0316 00:10:38.060065 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-gtwmf" Mar 16 00:10:38 crc kubenswrapper[4816]: I0316 00:10:38.062822 4816 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-gtwmf container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.24:8443/healthz\": 
dial tcp 10.217.0.24:8443: connect: connection refused" start-of-body= Mar 16 00:10:38 crc kubenswrapper[4816]: I0316 00:10:38.062881 4816 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-gtwmf" podUID="29bcfe72-6ef1-4087-9feb-787fdba3d2d7" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.24:8443/healthz\": dial tcp 10.217.0.24:8443: connect: connection refused" Mar 16 00:10:38 crc kubenswrapper[4816]: I0316 00:10:38.081202 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-v8zx4" event={"ID":"b98dcc1e-4c4b-47eb-9ddf-59a138f94247","Type":"ContainerStarted","Data":"02e89da47525606b5bdfb001ed5ff1164102abf4c6983f572471ec01e458a74c"} Mar 16 00:10:38 crc kubenswrapper[4816]: I0316 00:10:38.085294 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jm8db" event={"ID":"9e737c04-a2db-452e-adc7-fa383e158b53","Type":"ContainerStarted","Data":"b073d0d4c2c60bff6d7b6f5d036b426a16e798c5b984e87897f11efb64f9289a"} Mar 16 00:10:38 crc kubenswrapper[4816]: I0316 00:10:38.085451 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jm8db" event={"ID":"9e737c04-a2db-452e-adc7-fa383e158b53","Type":"ContainerStarted","Data":"92575e8e31aebcdde122ac5fae631766325c1f58e9a7f02918e7f2265ec057c1"} Mar 16 00:10:38 crc kubenswrapper[4816]: I0316 00:10:38.087600 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zn6w7" event={"ID":"064b42ee-720b-456c-8ffe-a247f827befc","Type":"ContainerStarted","Data":"73fbecdf6a7fd7c801af9284efb8b05891d8738c1930c8fb1def974987b2f95f"} Mar 16 00:10:38 crc kubenswrapper[4816]: I0316 00:10:38.087861 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zn6w7" Mar 16 00:10:38 crc kubenswrapper[4816]: I0316 00:10:38.089073 4816 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-zn6w7 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.36:5443/healthz\": dial tcp 10.217.0.36:5443: connect: connection refused" start-of-body= Mar 16 00:10:38 crc kubenswrapper[4816]: I0316 00:10:38.089118 4816 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zn6w7" podUID="064b42ee-720b-456c-8ffe-a247f827befc" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.36:5443/healthz\": dial tcp 10.217.0.36:5443: connect: connection refused" Mar 16 00:10:38 crc kubenswrapper[4816]: I0316 00:10:38.089576 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-vvdz2" event={"ID":"79ec9746-96c0-4fcd-b367-a42b6950145a","Type":"ContainerStarted","Data":"40a419244d9710129281429e56957e263d3d8536b057f748b088c553d5e63bc0"} Mar 16 00:10:38 crc kubenswrapper[4816]: I0316 00:10:38.090997 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-q9xc9" event={"ID":"4cc341f9-55c7-4bce-a0e3-24df68ca7f0e","Type":"ContainerStarted","Data":"1112e628b675b9cd7bfcdc017c14b275f18d39ad1cbe487a0c459256c84f03de"} Mar 16 00:10:38 crc kubenswrapper[4816]: I0316 00:10:38.091439 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-q9xc9" Mar 16 00:10:38 crc kubenswrapper[4816]: I0316 00:10:38.092481 4816 patch_prober.go:28] interesting pod/console-operator-58897d9998-q9xc9 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.13:8443/readyz\": 
dial tcp 10.217.0.13:8443: connect: connection refused" start-of-body= Mar 16 00:10:38 crc kubenswrapper[4816]: I0316 00:10:38.092519 4816 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-q9xc9" podUID="4cc341f9-55c7-4bce-a0e3-24df68ca7f0e" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.13:8443/readyz\": dial tcp 10.217.0.13:8443: connect: connection refused" Mar 16 00:10:38 crc kubenswrapper[4816]: I0316 00:10:38.093152 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ll5r8" event={"ID":"aa9721ae-1aaa-4d49-9f05-5aabb9ab31d9","Type":"ContainerStarted","Data":"0cb716ff167593f0ce0b7cf0c62e4a01bf26679766bfd30f596995ff18105521"} Mar 16 00:10:38 crc kubenswrapper[4816]: I0316 00:10:38.093183 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ll5r8" event={"ID":"aa9721ae-1aaa-4d49-9f05-5aabb9ab31d9","Type":"ContainerStarted","Data":"99594b5194aa148721f790813af4c45ce7463df5d5d273f411a500c85e93558e"} Mar 16 00:10:38 crc kubenswrapper[4816]: I0316 00:10:38.095849 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-2k6jt" event={"ID":"7759829d-d50c-4dd7-8627-040ebf8f0e40","Type":"ContainerStarted","Data":"ed6369d343eaa6ced138f1e08f8920e50b7e75f80249eeed063194ae330c409c"} Mar 16 00:10:38 crc kubenswrapper[4816]: I0316 00:10:38.096996 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29560320-4hk5d" event={"ID":"9397185e-a9e3-4ef4-b0be-d9dc9208adff","Type":"ContainerStarted","Data":"a26a0a4314d400c57bc06c18480ab7a501ebc981f4b8dbd60334dd3390aec49c"} Mar 16 00:10:38 crc kubenswrapper[4816]: I0316 00:10:38.098964 4816 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-dns/dns-default-npvts" event={"ID":"fb4df3ac-df8b-4c9c-b7aa-dd9cd455a847","Type":"ContainerStarted","Data":"26cce752542010b9b78a872e41deb471ebd65a5cf9e7b8dce46f0460be533d3d"} Mar 16 00:10:38 crc kubenswrapper[4816]: I0316 00:10:38.100335 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-l648b" event={"ID":"ef3a7303-57a8-461f-86c1-fd3f7882e93b","Type":"ContainerStarted","Data":"d8a2b78be01e0a3cad62bf791050eb4476b6f7635f0071cf9f563caf64a00f35"} Mar 16 00:10:38 crc kubenswrapper[4816]: I0316 00:10:38.106831 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-mwhpz" event={"ID":"dc6dfded-ec9e-4a6f-a97b-3bb4cf8a149a","Type":"ContainerStarted","Data":"4c5f5a9190c5fd7efd62406f64a4bc30706c3b659d00a057ad00f198d807f0fa"} Mar 16 00:10:38 crc kubenswrapper[4816]: I0316 00:10:38.106885 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-mwhpz" event={"ID":"dc6dfded-ec9e-4a6f-a97b-3bb4cf8a149a","Type":"ContainerStarted","Data":"115b6339d9deaae8dfb122833be5a7a879426f2d399688991e46b804d72884b0"} Mar 16 00:10:38 crc kubenswrapper[4816]: I0316 00:10:38.125859 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-sh4ps" event={"ID":"0a98bca9-38d3-4382-a6d6-8410170f7d81","Type":"ContainerStarted","Data":"2fb8f1515730058de3be4b92067511b1f91b5ec7b522a8c73925e59f777d1b2a"} Mar 16 00:10:38 crc kubenswrapper[4816]: I0316 00:10:38.130497 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-fwkzt" event={"ID":"0386f821-c5fb-4dfd-acaf-706e214a57c0","Type":"ContainerStarted","Data":"1f81d667b086a11aa202533810c522896c89e2ea6f7fb8a20b1c273c15855d63"} Mar 16 00:10:38 crc kubenswrapper[4816]: I0316 
00:10:38.130630 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-fwkzt" event={"ID":"0386f821-c5fb-4dfd-acaf-706e214a57c0","Type":"ContainerStarted","Data":"ff8e4bc0be4db6495a13d89a5084830a3005f00b79e93f30e8bb0988f12a09c0"} Mar 16 00:10:38 crc kubenswrapper[4816]: I0316 00:10:38.134137 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:10:38 crc kubenswrapper[4816]: I0316 00:10:38.135169 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-5rr7c" event={"ID":"0ec3cdc0-f024-43cf-b520-7d2437e0f8df","Type":"ContainerStarted","Data":"9e1758947a169fa8c89c8e3873ca56d930c8ca55c7c143100afca371ccc218fc"} Mar 16 00:10:38 crc kubenswrapper[4816]: E0316 00:10:38.137020 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:38.636974583 +0000 UTC m=+231.733274646 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ckvwn" (UID: "b155133b-d494-44bc-aa5d-23efc7cbd7a6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:38 crc kubenswrapper[4816]: I0316 00:10:38.140078 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-5rr7c" Mar 16 00:10:38 crc kubenswrapper[4816]: I0316 00:10:38.141501 4816 patch_prober.go:28] interesting pod/downloads-7954f5f757-5rr7c container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Mar 16 00:10:38 crc kubenswrapper[4816]: I0316 00:10:38.141589 4816 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-5rr7c" podUID="0ec3cdc0-f024-43cf-b520-7d2437e0f8df" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Mar 16 00:10:38 crc kubenswrapper[4816]: I0316 00:10:38.146381 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-fnmb9" event={"ID":"171b00f7-f7cf-41b3-bffd-11ceeb9f2182","Type":"ContainerStarted","Data":"024a1d4e9bfc2df97c747faa2472891f385dde870e0ae45dedd8fdf097d27c60"} Mar 16 00:10:38 crc kubenswrapper[4816]: I0316 00:10:38.153832 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tgbjg" 
event={"ID":"34f93b2b-cc36-4965-992c-825bf2595e1e","Type":"ContainerStarted","Data":"7b60a65838b86ad7eb2fb071b4ccae28841d696a876e23f060b33e63b90b13a3"} Mar 16 00:10:38 crc kubenswrapper[4816]: I0316 00:10:38.160158 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8tt9t" event={"ID":"3eb76779-61d8-4977-8839-083fcf6cd69b","Type":"ContainerStarted","Data":"a5ed6e71a594c27998c394e3bcbaac99853dd55430986ddf8a6c0c49260b7970"} Mar 16 00:10:38 crc kubenswrapper[4816]: I0316 00:10:38.161984 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-sshl5" event={"ID":"7c3e347f-464a-43f1-bf29-689bf81a28e6","Type":"ContainerStarted","Data":"3631ced358fcea8ef22224f7b1a8e3a7674d52e4a7296b38cf119840b4577b45"} Mar 16 00:10:38 crc kubenswrapper[4816]: I0316 00:10:38.165781 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-8226q" event={"ID":"02854230-6165-4f22-8780-d8591b991132","Type":"ContainerStarted","Data":"6a102196a4ace87bc37cea4d4ac25a7e1b7077cd996f17f48fd3ea1d9ccb059e"} Mar 16 00:10:38 crc kubenswrapper[4816]: I0316 00:10:38.165821 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-8226q" event={"ID":"02854230-6165-4f22-8780-d8591b991132","Type":"ContainerStarted","Data":"fbc545a6e69e36c7e153d8947909848cfdb5be666c80ed949869b9fabb25d45a"} Mar 16 00:10:38 crc kubenswrapper[4816]: I0316 00:10:38.166446 4816 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-d9j8j container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.11:8443/healthz\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body= Mar 16 00:10:38 crc kubenswrapper[4816]: I0316 00:10:38.166515 4816 prober.go:107] "Probe failed" 
probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-d9j8j" podUID="1d5466ab-a589-4f7e-ae89-2f494b10f6b1" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.11:8443/healthz\": dial tcp 10.217.0.11:8443: connect: connection refused" Mar 16 00:10:38 crc kubenswrapper[4816]: I0316 00:10:38.166617 4816 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-tv2n7 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.25:8443/healthz\": dial tcp 10.217.0.25:8443: connect: connection refused" start-of-body= Mar 16 00:10:38 crc kubenswrapper[4816]: I0316 00:10:38.166649 4816 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-tv2n7" podUID="59c840a8-f288-44ed-83d3-34d47041c6c6" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.25:8443/healthz\": dial tcp 10.217.0.25:8443: connect: connection refused" Mar 16 00:10:38 crc kubenswrapper[4816]: I0316 00:10:38.227329 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-v8zx4" podStartSLOduration=181.227309509 podStartE2EDuration="3m1.227309509s" podCreationTimestamp="2026-03-16 00:07:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:38.225888621 +0000 UTC m=+231.322188574" watchObservedRunningTime="2026-03-16 00:10:38.227309509 +0000 UTC m=+231.323609462" Mar 16 00:10:38 crc kubenswrapper[4816]: I0316 00:10:38.235697 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:38 crc kubenswrapper[4816]: E0316 00:10:38.235882 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:38.735857093 +0000 UTC m=+231.832157046 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:38 crc kubenswrapper[4816]: I0316 00:10:38.238176 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:10:38 crc kubenswrapper[4816]: E0316 00:10:38.238419 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:38.738404182 +0000 UTC m=+231.834704215 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ckvwn" (UID: "b155133b-d494-44bc-aa5d-23efc7cbd7a6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:38 crc kubenswrapper[4816]: I0316 00:10:38.270180 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-fnmb9" podStartSLOduration=180.270165539 podStartE2EDuration="3m0.270165539s" podCreationTimestamp="2026-03-16 00:07:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:38.268992987 +0000 UTC m=+231.365292940" watchObservedRunningTime="2026-03-16 00:10:38.270165539 +0000 UTC m=+231.366465492" Mar 16 00:10:38 crc kubenswrapper[4816]: I0316 00:10:38.308768 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zrq8d" podStartSLOduration=180.308747843 podStartE2EDuration="3m0.308747843s" podCreationTimestamp="2026-03-16 00:07:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:38.308359612 +0000 UTC m=+231.404659565" watchObservedRunningTime="2026-03-16 00:10:38.308747843 +0000 UTC m=+231.405047796" Mar 16 00:10:38 crc kubenswrapper[4816]: I0316 00:10:38.340038 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:38 crc kubenswrapper[4816]: E0316 00:10:38.340489 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:38.840475539 +0000 UTC m=+231.936775492 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:38 crc kubenswrapper[4816]: I0316 00:10:38.350308 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29560320-4hk5d" podStartSLOduration=181.350294877 podStartE2EDuration="3m1.350294877s" podCreationTimestamp="2026-03-16 00:07:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:38.349085364 +0000 UTC m=+231.445385307" watchObservedRunningTime="2026-03-16 00:10:38.350294877 +0000 UTC m=+231.446594830" Mar 16 00:10:38 crc kubenswrapper[4816]: I0316 00:10:38.386402 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zn6w7" podStartSLOduration=180.386385173 podStartE2EDuration="3m0.386385173s" podCreationTimestamp="2026-03-16 00:07:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:38.385360995 +0000 UTC 
m=+231.481660948" watchObservedRunningTime="2026-03-16 00:10:38.386385173 +0000 UTC m=+231.482685126" Mar 16 00:10:38 crc kubenswrapper[4816]: I0316 00:10:38.441755 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:10:38 crc kubenswrapper[4816]: E0316 00:10:38.442336 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:38.9423239 +0000 UTC m=+232.038623843 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ckvwn" (UID: "b155133b-d494-44bc-aa5d-23efc7cbd7a6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:38 crc kubenswrapper[4816]: I0316 00:10:38.469015 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-vvdz2" podStartSLOduration=180.468997198 podStartE2EDuration="3m0.468997198s" podCreationTimestamp="2026-03-16 00:07:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:38.46687999 +0000 UTC m=+231.563179943" watchObservedRunningTime="2026-03-16 00:10:38.468997198 +0000 UTC m=+231.565297151" Mar 16 00:10:38 crc 
kubenswrapper[4816]: I0316 00:10:38.511527 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-q9xc9" podStartSLOduration=180.511508949 podStartE2EDuration="3m0.511508949s" podCreationTimestamp="2026-03-16 00:07:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:38.511109568 +0000 UTC m=+231.607409521" watchObservedRunningTime="2026-03-16 00:10:38.511508949 +0000 UTC m=+231.607808902" Mar 16 00:10:38 crc kubenswrapper[4816]: I0316 00:10:38.545817 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:38 crc kubenswrapper[4816]: E0316 00:10:38.546291 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:39.046272548 +0000 UTC m=+232.142572501 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:38 crc kubenswrapper[4816]: I0316 00:10:38.558189 4816 patch_prober.go:28] interesting pod/router-default-5444994796-gvk75 container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Mar 16 00:10:38 crc kubenswrapper[4816]: I0316 00:10:38.558623 4816 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gvk75" podUID="681ca8e4-f909-4e8b-9f35-5ab8ca382e44" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" Mar 16 00:10:38 crc kubenswrapper[4816]: I0316 00:10:38.599685 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-sh4ps" podStartSLOduration=180.599662366 podStartE2EDuration="3m0.599662366s" podCreationTimestamp="2026-03-16 00:07:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:38.591936155 +0000 UTC m=+231.688236108" watchObservedRunningTime="2026-03-16 00:10:38.599662366 +0000 UTC m=+231.695962319" Mar 16 00:10:38 crc kubenswrapper[4816]: I0316 00:10:38.599912 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-5rr7c" podStartSLOduration=180.599907742 
podStartE2EDuration="3m0.599907742s" podCreationTimestamp="2026-03-16 00:07:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:38.555296434 +0000 UTC m=+231.651596387" watchObservedRunningTime="2026-03-16 00:10:38.599907742 +0000 UTC m=+231.696207695" Mar 16 00:10:38 crc kubenswrapper[4816]: I0316 00:10:38.633652 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-gtwmf" podStartSLOduration=180.633631183 podStartE2EDuration="3m0.633631183s" podCreationTimestamp="2026-03-16 00:07:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:38.631073443 +0000 UTC m=+231.727373396" watchObservedRunningTime="2026-03-16 00:10:38.633631183 +0000 UTC m=+231.729931136" Mar 16 00:10:38 crc kubenswrapper[4816]: I0316 00:10:38.648865 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:10:38 crc kubenswrapper[4816]: E0316 00:10:38.649279 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:39.14926459 +0000 UTC m=+232.245564543 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ckvwn" (UID: "b155133b-d494-44bc-aa5d-23efc7cbd7a6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:38 crc kubenswrapper[4816]: I0316 00:10:38.675237 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tgbjg" podStartSLOduration=180.675218779 podStartE2EDuration="3m0.675218779s" podCreationTimestamp="2026-03-16 00:07:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:38.671809846 +0000 UTC m=+231.768109799" watchObservedRunningTime="2026-03-16 00:10:38.675218779 +0000 UTC m=+231.771518732" Mar 16 00:10:38 crc kubenswrapper[4816]: I0316 00:10:38.750103 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:38 crc kubenswrapper[4816]: E0316 00:10:38.750514 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:39.250497294 +0000 UTC m=+232.346797247 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:38 crc kubenswrapper[4816]: I0316 00:10:38.852407 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:10:38 crc kubenswrapper[4816]: E0316 00:10:38.852919 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:39.35290279 +0000 UTC m=+232.449202743 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ckvwn" (UID: "b155133b-d494-44bc-aa5d-23efc7cbd7a6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:38 crc kubenswrapper[4816]: I0316 00:10:38.953754 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:38 crc kubenswrapper[4816]: E0316 00:10:38.953935 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:39.453910018 +0000 UTC m=+232.550209971 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:38 crc kubenswrapper[4816]: I0316 00:10:38.954078 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:10:38 crc kubenswrapper[4816]: E0316 00:10:38.954496 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:39.454482083 +0000 UTC m=+232.550782036 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ckvwn" (UID: "b155133b-d494-44bc-aa5d-23efc7cbd7a6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:39 crc kubenswrapper[4816]: I0316 00:10:39.054359 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zrq8d" Mar 16 00:10:39 crc kubenswrapper[4816]: I0316 00:10:39.054418 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zrq8d" Mar 16 00:10:39 crc kubenswrapper[4816]: I0316 00:10:39.054861 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:39 crc kubenswrapper[4816]: E0316 00:10:39.055045 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:39.555016718 +0000 UTC m=+232.651316671 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:39 crc kubenswrapper[4816]: I0316 00:10:39.055203 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:10:39 crc kubenswrapper[4816]: E0316 00:10:39.055668 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:39.555650886 +0000 UTC m=+232.651950839 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ckvwn" (UID: "b155133b-d494-44bc-aa5d-23efc7cbd7a6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:39 crc kubenswrapper[4816]: I0316 00:10:39.056187 4816 patch_prober.go:28] interesting pod/apiserver-7bbb656c7d-zrq8d container/oauth-apiserver namespace/openshift-oauth-apiserver: Startup probe status=failure output="Get \"https://10.217.0.10:8443/livez\": dial tcp 10.217.0.10:8443: connect: connection refused" start-of-body= Mar 16 00:10:39 crc kubenswrapper[4816]: I0316 00:10:39.056224 4816 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zrq8d" podUID="dced2102-9fd0-4300-9e0a-35d915f1caad" containerName="oauth-apiserver" probeResult="failure" output="Get \"https://10.217.0.10:8443/livez\": dial tcp 10.217.0.10:8443: connect: connection refused" Mar 16 00:10:39 crc kubenswrapper[4816]: I0316 00:10:39.156329 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:39 crc kubenswrapper[4816]: E0316 00:10:39.156482 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:39.656460938 +0000 UTC m=+232.752760901 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:39 crc kubenswrapper[4816]: I0316 00:10:39.156614 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:10:39 crc kubenswrapper[4816]: E0316 00:10:39.156950 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:39.656940581 +0000 UTC m=+232.753240534 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ckvwn" (UID: "b155133b-d494-44bc-aa5d-23efc7cbd7a6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:39 crc kubenswrapper[4816]: I0316 00:10:39.179447 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-2k6jt" event={"ID":"7759829d-d50c-4dd7-8627-040ebf8f0e40","Type":"ContainerStarted","Data":"95989697f1af58eeb14eed6626f25568a7d89fa8659db716371d303537096b95"} Mar 16 00:10:39 crc kubenswrapper[4816]: I0316 00:10:39.190882 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-fwkzt" event={"ID":"0386f821-c5fb-4dfd-acaf-706e214a57c0","Type":"ContainerStarted","Data":"eb1247ca54e1725f92ff47a334cb3f93c7288edaf95c85bb29119f4190447728"} Mar 16 00:10:39 crc kubenswrapper[4816]: I0316 00:10:39.198969 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8tt9t" event={"ID":"3eb76779-61d8-4977-8839-083fcf6cd69b","Type":"ContainerStarted","Data":"bfb1da3e05b684a07c1c5093f75a36fc7234932ef5ea1cdcd2af912b2efbadc3"} Mar 16 00:10:39 crc kubenswrapper[4816]: I0316 00:10:39.201790 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-2k6jt" podStartSLOduration=8.201772685 podStartE2EDuration="8.201772685s" podCreationTimestamp="2026-03-16 00:10:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:39.198059164 +0000 UTC m=+232.294359127" 
watchObservedRunningTime="2026-03-16 00:10:39.201772685 +0000 UTC m=+232.298072648" Mar 16 00:10:39 crc kubenswrapper[4816]: I0316 00:10:39.202005 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-pdm8d" event={"ID":"1f26ea52-1f97-4d4a-98bd-897c5b3b88c5","Type":"ContainerStarted","Data":"cd8239e30fa47ad0b09c897db8dede32e8baf15326f27d0d799c7d11f5bf9245"} Mar 16 00:10:39 crc kubenswrapper[4816]: I0316 00:10:39.212784 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-mplx7" event={"ID":"7442ef1b-27ea-4166-8457-5332c4c8f363","Type":"ContainerStarted","Data":"52c9358f54c69ab6c19056c97d3fd556c1de4b607e855be3201418922f2918bc"} Mar 16 00:10:39 crc kubenswrapper[4816]: I0316 00:10:39.216788 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-sshl5" event={"ID":"7c3e347f-464a-43f1-bf29-689bf81a28e6","Type":"ContainerStarted","Data":"e323e811f301454593a76b4a27e968d571ea79e5909647b904b1cad07862ea62"} Mar 16 00:10:39 crc kubenswrapper[4816]: I0316 00:10:39.217135 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-sshl5" Mar 16 00:10:39 crc kubenswrapper[4816]: I0316 00:10:39.218844 4816 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-sshl5 container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.23:6443/healthz\": dial tcp 10.217.0.23:6443: connect: connection refused" start-of-body= Mar 16 00:10:39 crc kubenswrapper[4816]: I0316 00:10:39.218881 4816 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-sshl5" podUID="7c3e347f-464a-43f1-bf29-689bf81a28e6" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.23:6443/healthz\": dial tcp 10.217.0.23:6443: connect: 
connection refused" Mar 16 00:10:39 crc kubenswrapper[4816]: I0316 00:10:39.225170 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-l648b" event={"ID":"ef3a7303-57a8-461f-86c1-fd3f7882e93b","Type":"ContainerStarted","Data":"848fdb51e4355de505e505755585dc2dce6c8c4c01ec5f0f58747f35b9c9b660"} Mar 16 00:10:39 crc kubenswrapper[4816]: I0316 00:10:39.227314 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-fwkzt" podStartSLOduration=181.227295802 podStartE2EDuration="3m1.227295802s" podCreationTimestamp="2026-03-16 00:07:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:39.225957115 +0000 UTC m=+232.322257068" watchObservedRunningTime="2026-03-16 00:10:39.227295802 +0000 UTC m=+232.323595755" Mar 16 00:10:39 crc kubenswrapper[4816]: I0316 00:10:39.239935 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jm8db" event={"ID":"9e737c04-a2db-452e-adc7-fa383e158b53","Type":"ContainerStarted","Data":"d2b2e94e44481d7d62ca47549fa81cc832d88928edb0ed280b9862d6e4e1afa8"} Mar 16 00:10:39 crc kubenswrapper[4816]: I0316 00:10:39.240837 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jm8db" Mar 16 00:10:39 crc kubenswrapper[4816]: I0316 00:10:39.244109 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4t6gm" event={"ID":"f4bdfe91-48ef-4a44-99ba-d7ab90df9ec0","Type":"ContainerStarted","Data":"da9257f464c0b60fee5d912e86c720ccf6b77f86904d024f2ffea1f8dfd90424"} Mar 16 00:10:39 crc kubenswrapper[4816]: I0316 00:10:39.246734 4816 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8tt9t" podStartSLOduration=181.246720162 podStartE2EDuration="3m1.246720162s" podCreationTimestamp="2026-03-16 00:07:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:39.245271263 +0000 UTC m=+232.341571216" watchObservedRunningTime="2026-03-16 00:10:39.246720162 +0000 UTC m=+232.343020115" Mar 16 00:10:39 crc kubenswrapper[4816]: I0316 00:10:39.248018 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-npvts" event={"ID":"fb4df3ac-df8b-4c9c-b7aa-dd9cd455a847","Type":"ContainerStarted","Data":"78c7fcdf09f86057529d576ef181b7396649103d70b1badae9d8a06b15d9d653"} Mar 16 00:10:39 crc kubenswrapper[4816]: I0316 00:10:39.248053 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-npvts" event={"ID":"fb4df3ac-df8b-4c9c-b7aa-dd9cd455a847","Type":"ContainerStarted","Data":"c8352705d653ad742717cade5bf89d6cc224bcfeb440f69392e89c3a246c31ef"} Mar 16 00:10:39 crc kubenswrapper[4816]: I0316 00:10:39.248540 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-npvts" Mar 16 00:10:39 crc kubenswrapper[4816]: I0316 00:10:39.250088 4816 generic.go:334] "Generic (PLEG): container finished" podID="4f90d894-17c6-4800-a438-737fe8619e01" containerID="d9558b98ac0b8301b1e2fd81ab83d4eaebf891ae7f77f266a39b5bc52e74f754" exitCode=0 Mar 16 00:10:39 crc kubenswrapper[4816]: I0316 00:10:39.251525 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-d4vrm" event={"ID":"4f90d894-17c6-4800-a438-737fe8619e01","Type":"ContainerDied","Data":"d9558b98ac0b8301b1e2fd81ab83d4eaebf891ae7f77f266a39b5bc52e74f754"} Mar 16 00:10:39 crc kubenswrapper[4816]: I0316 00:10:39.251972 4816 patch_prober.go:28] interesting 
pod/packageserver-d55dfcdfc-zn6w7 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.36:5443/healthz\": dial tcp 10.217.0.36:5443: connect: connection refused" start-of-body= Mar 16 00:10:39 crc kubenswrapper[4816]: I0316 00:10:39.255296 4816 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zn6w7" podUID="064b42ee-720b-456c-8ffe-a247f827befc" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.36:5443/healthz\": dial tcp 10.217.0.36:5443: connect: connection refused" Mar 16 00:10:39 crc kubenswrapper[4816]: I0316 00:10:39.252292 4816 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-gtwmf container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.24:8443/healthz\": dial tcp 10.217.0.24:8443: connect: connection refused" start-of-body= Mar 16 00:10:39 crc kubenswrapper[4816]: I0316 00:10:39.255412 4816 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-gtwmf" podUID="29bcfe72-6ef1-4087-9feb-787fdba3d2d7" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.24:8443/healthz\": dial tcp 10.217.0.24:8443: connect: connection refused" Mar 16 00:10:39 crc kubenswrapper[4816]: I0316 00:10:39.252709 4816 patch_prober.go:28] interesting pod/console-operator-58897d9998-q9xc9 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.13:8443/readyz\": dial tcp 10.217.0.13:8443: connect: connection refused" start-of-body= Mar 16 00:10:39 crc kubenswrapper[4816]: I0316 00:10:39.255447 4816 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-q9xc9" 
podUID="4cc341f9-55c7-4bce-a0e3-24df68ca7f0e" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.13:8443/readyz\": dial tcp 10.217.0.13:8443: connect: connection refused" Mar 16 00:10:39 crc kubenswrapper[4816]: I0316 00:10:39.254227 4816 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-tv2n7 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.25:8443/healthz\": dial tcp 10.217.0.25:8443: connect: connection refused" start-of-body= Mar 16 00:10:39 crc kubenswrapper[4816]: I0316 00:10:39.255475 4816 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-tv2n7" podUID="59c840a8-f288-44ed-83d3-34d47041c6c6" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.25:8443/healthz\": dial tcp 10.217.0.25:8443: connect: connection refused" Mar 16 00:10:39 crc kubenswrapper[4816]: I0316 00:10:39.255519 4816 patch_prober.go:28] interesting pod/downloads-7954f5f757-5rr7c container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Mar 16 00:10:39 crc kubenswrapper[4816]: I0316 00:10:39.255564 4816 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-5rr7c" podUID="0ec3cdc0-f024-43cf-b520-7d2437e0f8df" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Mar 16 00:10:39 crc kubenswrapper[4816]: I0316 00:10:39.261384 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:39 crc kubenswrapper[4816]: E0316 00:10:39.261528 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:39.761502796 +0000 UTC m=+232.857802759 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:39 crc kubenswrapper[4816]: I0316 00:10:39.261894 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:10:39 crc kubenswrapper[4816]: E0316 00:10:39.265163 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:39.765145255 +0000 UTC m=+232.861445258 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ckvwn" (UID: "b155133b-d494-44bc-aa5d-23efc7cbd7a6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:39 crc kubenswrapper[4816]: I0316 00:10:39.285320 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-sshl5" podStartSLOduration=182.285291975 podStartE2EDuration="3m2.285291975s" podCreationTimestamp="2026-03-16 00:07:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:39.28434996 +0000 UTC m=+232.380649913" watchObservedRunningTime="2026-03-16 00:10:39.285291975 +0000 UTC m=+232.381591928" Mar 16 00:10:39 crc kubenswrapper[4816]: I0316 00:10:39.321848 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-pdm8d" podStartSLOduration=182.321832823 podStartE2EDuration="3m2.321832823s" podCreationTimestamp="2026-03-16 00:07:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:39.317180536 +0000 UTC m=+232.413480489" watchObservedRunningTime="2026-03-16 00:10:39.321832823 +0000 UTC m=+232.418132776" Mar 16 00:10:39 crc kubenswrapper[4816]: I0316 00:10:39.342185 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-l648b" podStartSLOduration=181.342165578 podStartE2EDuration="3m1.342165578s" podCreationTimestamp="2026-03-16 00:07:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 
UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:39.340827622 +0000 UTC m=+232.437127575" watchObservedRunningTime="2026-03-16 00:10:39.342165578 +0000 UTC m=+232.438465531" Mar 16 00:10:39 crc kubenswrapper[4816]: I0316 00:10:39.366077 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:39 crc kubenswrapper[4816]: E0316 00:10:39.367576 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:39.867530451 +0000 UTC m=+232.963830404 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:39 crc kubenswrapper[4816]: I0316 00:10:39.391087 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-mplx7" podStartSLOduration=181.391047763 podStartE2EDuration="3m1.391047763s" podCreationTimestamp="2026-03-16 00:07:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:39.382234182 +0000 UTC m=+232.478534135" watchObservedRunningTime="2026-03-16 00:10:39.391047763 +0000 UTC m=+232.487347716" Mar 16 00:10:39 crc kubenswrapper[4816]: I0316 00:10:39.402398 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-8226q" podStartSLOduration=181.402379202 podStartE2EDuration="3m1.402379202s" podCreationTimestamp="2026-03-16 00:07:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:39.401693784 +0000 UTC m=+232.497993737" watchObservedRunningTime="2026-03-16 00:10:39.402379202 +0000 UTC m=+232.498679155" Mar 16 00:10:39 crc kubenswrapper[4816]: I0316 00:10:39.431901 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4t6gm" podStartSLOduration=181.431883478 podStartE2EDuration="3m1.431883478s" podCreationTimestamp="2026-03-16 00:07:38 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:39.430345706 +0000 UTC m=+232.526645659" watchObservedRunningTime="2026-03-16 00:10:39.431883478 +0000 UTC m=+232.528183431" Mar 16 00:10:39 crc kubenswrapper[4816]: I0316 00:10:39.453375 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-mwhpz" podStartSLOduration=181.453351934 podStartE2EDuration="3m1.453351934s" podCreationTimestamp="2026-03-16 00:07:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:39.451645537 +0000 UTC m=+232.547945490" watchObservedRunningTime="2026-03-16 00:10:39.453351934 +0000 UTC m=+232.549651887" Mar 16 00:10:39 crc kubenswrapper[4816]: I0316 00:10:39.469059 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:10:39 crc kubenswrapper[4816]: E0316 00:10:39.469410 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:39.969397952 +0000 UTC m=+233.065697905 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ckvwn" (UID: "b155133b-d494-44bc-aa5d-23efc7cbd7a6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:39 crc kubenswrapper[4816]: I0316 00:10:39.521285 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jm8db" podStartSLOduration=181.521267078 podStartE2EDuration="3m1.521267078s" podCreationTimestamp="2026-03-16 00:07:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:39.494027725 +0000 UTC m=+232.590327688" watchObservedRunningTime="2026-03-16 00:10:39.521267078 +0000 UTC m=+232.617567031" Mar 16 00:10:39 crc kubenswrapper[4816]: I0316 00:10:39.551009 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-nsxl4" podStartSLOduration=181.55099199 podStartE2EDuration="3m1.55099199s" podCreationTimestamp="2026-03-16 00:07:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:39.524440055 +0000 UTC m=+232.620740008" watchObservedRunningTime="2026-03-16 00:10:39.55099199 +0000 UTC m=+232.647291943" Mar 16 00:10:39 crc kubenswrapper[4816]: I0316 00:10:39.560300 4816 patch_prober.go:28] interesting pod/router-default-5444994796-gvk75 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 16 00:10:39 crc 
kubenswrapper[4816]: [-]has-synced failed: reason withheld Mar 16 00:10:39 crc kubenswrapper[4816]: [+]process-running ok Mar 16 00:10:39 crc kubenswrapper[4816]: healthz check failed Mar 16 00:10:39 crc kubenswrapper[4816]: I0316 00:10:39.560362 4816 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gvk75" podUID="681ca8e4-f909-4e8b-9f35-5ab8ca382e44" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 16 00:10:39 crc kubenswrapper[4816]: I0316 00:10:39.571040 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:39 crc kubenswrapper[4816]: E0316 00:10:39.571451 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:40.071433268 +0000 UTC m=+233.167733221 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:39 crc kubenswrapper[4816]: I0316 00:10:39.588630 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ll5r8" podStartSLOduration=181.588612237 podStartE2EDuration="3m1.588612237s" podCreationTimestamp="2026-03-16 00:07:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:39.553047026 +0000 UTC m=+232.649346979" watchObservedRunningTime="2026-03-16 00:10:39.588612237 +0000 UTC m=+232.684912190" Mar 16 00:10:39 crc kubenswrapper[4816]: I0316 00:10:39.591064 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-6nkm6" podStartSLOduration=181.591054334 podStartE2EDuration="3m1.591054334s" podCreationTimestamp="2026-03-16 00:07:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:39.588178405 +0000 UTC m=+232.684478358" watchObservedRunningTime="2026-03-16 00:10:39.591054334 +0000 UTC m=+232.687354287" Mar 16 00:10:39 crc kubenswrapper[4816]: I0316 00:10:39.672229 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:10:39 crc kubenswrapper[4816]: E0316 00:10:39.672527 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:40.172515898 +0000 UTC m=+233.268815851 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ckvwn" (UID: "b155133b-d494-44bc-aa5d-23efc7cbd7a6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:39 crc kubenswrapper[4816]: I0316 00:10:39.737528 4816 ???:1] "http: TLS handshake error from 192.168.126.11:54916: no serving certificate available for the kubelet" Mar 16 00:10:39 crc kubenswrapper[4816]: I0316 00:10:39.773055 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:39 crc kubenswrapper[4816]: E0316 00:10:39.773470 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:40.273452724 +0000 UTC m=+233.369752677 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:39 crc kubenswrapper[4816]: I0316 00:10:39.785581 4816 ???:1] "http: TLS handshake error from 192.168.126.11:54918: no serving certificate available for the kubelet" Mar 16 00:10:39 crc kubenswrapper[4816]: I0316 00:10:39.833496 4816 ???:1] "http: TLS handshake error from 192.168.126.11:54922: no serving certificate available for the kubelet" Mar 16 00:10:39 crc kubenswrapper[4816]: I0316 00:10:39.875095 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:10:39 crc kubenswrapper[4816]: E0316 00:10:39.875580 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:40.375566332 +0000 UTC m=+233.471866285 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ckvwn" (UID: "b155133b-d494-44bc-aa5d-23efc7cbd7a6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:39 crc kubenswrapper[4816]: I0316 00:10:39.893090 4816 ???:1] "http: TLS handshake error from 192.168.126.11:54930: no serving certificate available for the kubelet" Mar 16 00:10:39 crc kubenswrapper[4816]: I0316 00:10:39.959024 4816 ???:1] "http: TLS handshake error from 192.168.126.11:54944: no serving certificate available for the kubelet" Mar 16 00:10:39 crc kubenswrapper[4816]: I0316 00:10:39.976810 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:39 crc kubenswrapper[4816]: E0316 00:10:39.976987 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:40.47695934 +0000 UTC m=+233.573259293 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:39 crc kubenswrapper[4816]: I0316 00:10:39.977113 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:10:39 crc kubenswrapper[4816]: E0316 00:10:39.977676 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:40.477662519 +0000 UTC m=+233.573962472 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ckvwn" (UID: "b155133b-d494-44bc-aa5d-23efc7cbd7a6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:40 crc kubenswrapper[4816]: I0316 00:10:40.058324 4816 ???:1] "http: TLS handshake error from 192.168.126.11:54956: no serving certificate available for the kubelet" Mar 16 00:10:40 crc kubenswrapper[4816]: I0316 00:10:40.078918 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:40 crc kubenswrapper[4816]: E0316 00:10:40.079149 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:40.579120619 +0000 UTC m=+233.675420582 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:40 crc kubenswrapper[4816]: I0316 00:10:40.079231 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:10:40 crc kubenswrapper[4816]: E0316 00:10:40.079668 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:40.579656944 +0000 UTC m=+233.675956977 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ckvwn" (UID: "b155133b-d494-44bc-aa5d-23efc7cbd7a6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:40 crc kubenswrapper[4816]: I0316 00:10:40.180923 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:40 crc kubenswrapper[4816]: E0316 00:10:40.181710 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:40.68168809 +0000 UTC m=+233.777988043 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:40 crc kubenswrapper[4816]: I0316 00:10:40.252676 4816 ???:1] "http: TLS handshake error from 192.168.126.11:54964: no serving certificate available for the kubelet" Mar 16 00:10:40 crc kubenswrapper[4816]: I0316 00:10:40.271472 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-d4vrm" event={"ID":"4f90d894-17c6-4800-a438-737fe8619e01","Type":"ContainerStarted","Data":"398869ad6a60ce3b3a8c87e03134cd0a8845b94fb0f7932db64c46ad9a35842c"} Mar 16 00:10:40 crc kubenswrapper[4816]: I0316 00:10:40.271539 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-d4vrm" Mar 16 00:10:40 crc kubenswrapper[4816]: I0316 00:10:40.274111 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-trp9l" event={"ID":"a1a13dfd-7f8c-4a52-9ace-ffb14d3d91f2","Type":"ContainerStarted","Data":"3c467605f4551d8c3d5668a70ac90cbcf3f0ed8dd7b0ae3d6b85b6ea1fe8119c"} Mar 16 00:10:40 crc kubenswrapper[4816]: I0316 00:10:40.275221 4816 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-gtwmf container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.24:8443/healthz\": dial tcp 10.217.0.24:8443: connect: connection refused" start-of-body= Mar 16 00:10:40 crc kubenswrapper[4816]: I0316 00:10:40.275263 4816 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-gtwmf" podUID="29bcfe72-6ef1-4087-9feb-787fdba3d2d7" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.24:8443/healthz\": dial tcp 10.217.0.24:8443: connect: connection refused" Mar 16 00:10:40 crc kubenswrapper[4816]: I0316 00:10:40.275436 4816 patch_prober.go:28] interesting pod/downloads-7954f5f757-5rr7c container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Mar 16 00:10:40 crc kubenswrapper[4816]: I0316 00:10:40.275235 4816 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-sshl5 container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.23:6443/healthz\": dial tcp 10.217.0.23:6443: connect: connection refused" start-of-body= Mar 16 00:10:40 crc kubenswrapper[4816]: I0316 00:10:40.275636 4816 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-sshl5" podUID="7c3e347f-464a-43f1-bf29-689bf81a28e6" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.23:6443/healthz\": dial tcp 10.217.0.23:6443: connect: connection refused" Mar 16 00:10:40 crc kubenswrapper[4816]: I0316 00:10:40.275631 4816 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-5rr7c" podUID="0ec3cdc0-f024-43cf-b520-7d2437e0f8df" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Mar 16 00:10:40 crc kubenswrapper[4816]: I0316 00:10:40.282596 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:10:40 crc kubenswrapper[4816]: E0316 00:10:40.282934 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:40.782922184 +0000 UTC m=+233.879222137 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ckvwn" (UID: "b155133b-d494-44bc-aa5d-23efc7cbd7a6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:40 crc kubenswrapper[4816]: I0316 00:10:40.291749 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-npvts" podStartSLOduration=8.291727524 podStartE2EDuration="8.291727524s" podCreationTimestamp="2026-03-16 00:10:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:39.649596672 +0000 UTC m=+232.745896625" watchObservedRunningTime="2026-03-16 00:10:40.291727524 +0000 UTC m=+233.388027477" Mar 16 00:10:40 crc kubenswrapper[4816]: I0316 00:10:40.292785 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-d4vrm" podStartSLOduration=183.292775353 podStartE2EDuration="3m3.292775353s" podCreationTimestamp="2026-03-16 00:07:37 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:40.288701362 +0000 UTC m=+233.385001315" watchObservedRunningTime="2026-03-16 00:10:40.292775353 +0000 UTC m=+233.389075306" Mar 16 00:10:40 crc kubenswrapper[4816]: I0316 00:10:40.383804 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:40 crc kubenswrapper[4816]: E0316 00:10:40.383947 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:40.883923921 +0000 UTC m=+233.980223874 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:40 crc kubenswrapper[4816]: I0316 00:10:40.384271 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:10:40 crc kubenswrapper[4816]: E0316 00:10:40.389091 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:40.889074602 +0000 UTC m=+233.985374635 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ckvwn" (UID: "b155133b-d494-44bc-aa5d-23efc7cbd7a6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:40 crc kubenswrapper[4816]: I0316 00:10:40.485734 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:40 crc kubenswrapper[4816]: E0316 00:10:40.485897 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:40.985876094 +0000 UTC m=+234.082176047 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:40 crc kubenswrapper[4816]: I0316 00:10:40.486335 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:10:40 crc kubenswrapper[4816]: E0316 00:10:40.486665 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:40.986655025 +0000 UTC m=+234.082954978 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ckvwn" (UID: "b155133b-d494-44bc-aa5d-23efc7cbd7a6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:40 crc kubenswrapper[4816]: I0316 00:10:40.556141 4816 patch_prober.go:28] interesting pod/router-default-5444994796-gvk75 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 16 00:10:40 crc kubenswrapper[4816]: [-]has-synced failed: reason withheld Mar 16 00:10:40 crc kubenswrapper[4816]: [+]process-running ok Mar 16 00:10:40 crc kubenswrapper[4816]: healthz check failed Mar 16 00:10:40 crc kubenswrapper[4816]: I0316 00:10:40.556213 4816 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gvk75" podUID="681ca8e4-f909-4e8b-9f35-5ab8ca382e44" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 16 00:10:40 crc kubenswrapper[4816]: I0316 00:10:40.587454 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:40 crc kubenswrapper[4816]: E0316 00:10:40.587689 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-16 00:10:41.087643013 +0000 UTC m=+234.183942976 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:40 crc kubenswrapper[4816]: I0316 00:10:40.587951 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:10:40 crc kubenswrapper[4816]: E0316 00:10:40.588308 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:41.08829064 +0000 UTC m=+234.184590623 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ckvwn" (UID: "b155133b-d494-44bc-aa5d-23efc7cbd7a6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:40 crc kubenswrapper[4816]: I0316 00:10:40.628088 4816 ???:1] "http: TLS handshake error from 192.168.126.11:54972: no serving certificate available for the kubelet" Mar 16 00:10:40 crc kubenswrapper[4816]: I0316 00:10:40.688844 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:40 crc kubenswrapper[4816]: E0316 00:10:40.689157 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:41.189108543 +0000 UTC m=+234.285408496 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:40 crc kubenswrapper[4816]: I0316 00:10:40.791414 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:10:40 crc kubenswrapper[4816]: E0316 00:10:40.791836 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:41.291814917 +0000 UTC m=+234.388114870 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ckvwn" (UID: "b155133b-d494-44bc-aa5d-23efc7cbd7a6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:40 crc kubenswrapper[4816]: I0316 00:10:40.892849 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:40 crc kubenswrapper[4816]: E0316 00:10:40.893081 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:41.393034801 +0000 UTC m=+234.489334754 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:40 crc kubenswrapper[4816]: I0316 00:10:40.893168 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:10:40 crc kubenswrapper[4816]: E0316 00:10:40.893503 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:41.393485323 +0000 UTC m=+234.489785276 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ckvwn" (UID: "b155133b-d494-44bc-aa5d-23efc7cbd7a6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:40 crc kubenswrapper[4816]: I0316 00:10:40.994090 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:40 crc kubenswrapper[4816]: E0316 00:10:40.996848 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:41.496825865 +0000 UTC m=+234.593125818 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:40 crc kubenswrapper[4816]: I0316 00:10:40.997008 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:10:40 crc kubenswrapper[4816]: E0316 00:10:40.997430 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:41.497417981 +0000 UTC m=+234.593717934 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ckvwn" (UID: "b155133b-d494-44bc-aa5d-23efc7cbd7a6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:41 crc kubenswrapper[4816]: I0316 00:10:41.098139 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:41 crc kubenswrapper[4816]: E0316 00:10:41.098527 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:41.598508081 +0000 UTC m=+234.694808034 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:41 crc kubenswrapper[4816]: I0316 00:10:41.199276 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:10:41 crc kubenswrapper[4816]: E0316 00:10:41.199653 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:41.699637192 +0000 UTC m=+234.795937145 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ckvwn" (UID: "b155133b-d494-44bc-aa5d-23efc7cbd7a6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:41 crc kubenswrapper[4816]: I0316 00:10:41.300477 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:41 crc kubenswrapper[4816]: E0316 00:10:41.300666 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:41.800636769 +0000 UTC m=+234.896936732 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:41 crc kubenswrapper[4816]: I0316 00:10:41.374522 4816 ???:1] "http: TLS handshake error from 192.168.126.11:54984: no serving certificate available for the kubelet" Mar 16 00:10:41 crc kubenswrapper[4816]: I0316 00:10:41.421326 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:10:41 crc kubenswrapper[4816]: E0316 00:10:41.421722 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:41.921707235 +0000 UTC m=+235.018007188 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ckvwn" (UID: "b155133b-d494-44bc-aa5d-23efc7cbd7a6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:41 crc kubenswrapper[4816]: I0316 00:10:41.522870 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:41 crc kubenswrapper[4816]: E0316 00:10:41.523158 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:42.023106564 +0000 UTC m=+235.119406527 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:41 crc kubenswrapper[4816]: I0316 00:10:41.523598 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:10:41 crc kubenswrapper[4816]: E0316 00:10:41.524014 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:42.023996268 +0000 UTC m=+235.120296421 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ckvwn" (UID: "b155133b-d494-44bc-aa5d-23efc7cbd7a6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:41 crc kubenswrapper[4816]: I0316 00:10:41.553008 4816 patch_prober.go:28] interesting pod/router-default-5444994796-gvk75 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 16 00:10:41 crc kubenswrapper[4816]: [-]has-synced failed: reason withheld Mar 16 00:10:41 crc kubenswrapper[4816]: [+]process-running ok Mar 16 00:10:41 crc kubenswrapper[4816]: healthz check failed Mar 16 00:10:41 crc kubenswrapper[4816]: I0316 00:10:41.553071 4816 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gvk75" podUID="681ca8e4-f909-4e8b-9f35-5ab8ca382e44" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 16 00:10:41 crc kubenswrapper[4816]: I0316 00:10:41.624561 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:41 crc kubenswrapper[4816]: E0316 00:10:41.625099 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-16 00:10:42.125079908 +0000 UTC m=+235.221379881 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:41 crc kubenswrapper[4816]: I0316 00:10:41.726822 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:10:41 crc kubenswrapper[4816]: E0316 00:10:41.727338 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:42.227317069 +0000 UTC m=+235.323617092 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ckvwn" (UID: "b155133b-d494-44bc-aa5d-23efc7cbd7a6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:41 crc kubenswrapper[4816]: I0316 00:10:41.829562 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:41 crc kubenswrapper[4816]: E0316 00:10:41.830015 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:42.329997983 +0000 UTC m=+235.426297936 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:41 crc kubenswrapper[4816]: I0316 00:10:41.931261 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:10:41 crc kubenswrapper[4816]: E0316 00:10:41.931707 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:42.431687289 +0000 UTC m=+235.527987332 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ckvwn" (UID: "b155133b-d494-44bc-aa5d-23efc7cbd7a6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.032178 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:42 crc kubenswrapper[4816]: E0316 00:10:42.032309 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:42.532292056 +0000 UTC m=+235.628592009 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.032428 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:10:42 crc kubenswrapper[4816]: E0316 00:10:42.032748 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:42.532741028 +0000 UTC m=+235.629040981 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ckvwn" (UID: "b155133b-d494-44bc-aa5d-23efc7cbd7a6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.133504 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:42 crc kubenswrapper[4816]: E0316 00:10:42.133804 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:42.633762676 +0000 UTC m=+235.730062629 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.133883 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:10:42 crc kubenswrapper[4816]: E0316 00:10:42.134207 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:42.634190218 +0000 UTC m=+235.730490171 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ckvwn" (UID: "b155133b-d494-44bc-aa5d-23efc7cbd7a6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.236488 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:42 crc kubenswrapper[4816]: E0316 00:10:42.236651 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:42.736631695 +0000 UTC m=+235.832931648 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.236724 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:10:42 crc kubenswrapper[4816]: E0316 00:10:42.237064 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:42.737056787 +0000 UTC m=+235.833356740 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ckvwn" (UID: "b155133b-d494-44bc-aa5d-23efc7cbd7a6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.336188 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.337004 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.337361 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:42 crc kubenswrapper[4816]: E0316 00:10:42.337592 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:42.83753621 +0000 UTC m=+235.933836173 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.337661 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:10:42 crc kubenswrapper[4816]: E0316 00:10:42.337891 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:42.837880039 +0000 UTC m=+235.934179992 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ckvwn" (UID: "b155133b-d494-44bc-aa5d-23efc7cbd7a6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.349132 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-4gwcw"] Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.350040 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4gwcw" Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.354963 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.355151 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.357917 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-trp9l" event={"ID":"a1a13dfd-7f8c-4a52-9ace-ffb14d3d91f2","Type":"ContainerStarted","Data":"568dcbbc8f060aeb28cb18242ceb31f90f8759aece7ef7d23357f966a0c5ba20"} Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.358142 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.404329 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.406592 4816 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openshift-marketplace/community-operators-4gwcw"] Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.461195 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.461390 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45bbd\" (UniqueName: \"kubernetes.io/projected/ad80e1a9-75dc-4860-9bd9-d59b0c0ae43c-kube-api-access-45bbd\") pod \"community-operators-4gwcw\" (UID: \"ad80e1a9-75dc-4860-9bd9-d59b0c0ae43c\") " pod="openshift-marketplace/community-operators-4gwcw" Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.461445 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/46c3b4df-48c2-4131-8ac4-ea6276d70d54-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"46c3b4df-48c2-4131-8ac4-ea6276d70d54\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.461510 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad80e1a9-75dc-4860-9bd9-d59b0c0ae43c-utilities\") pod \"community-operators-4gwcw\" (UID: \"ad80e1a9-75dc-4860-9bd9-d59b0c0ae43c\") " pod="openshift-marketplace/community-operators-4gwcw" Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.461527 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad80e1a9-75dc-4860-9bd9-d59b0c0ae43c-catalog-content\") pod 
\"community-operators-4gwcw\" (UID: \"ad80e1a9-75dc-4860-9bd9-d59b0c0ae43c\") " pod="openshift-marketplace/community-operators-4gwcw" Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.461563 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/46c3b4df-48c2-4131-8ac4-ea6276d70d54-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"46c3b4df-48c2-4131-8ac4-ea6276d70d54\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 16 00:10:42 crc kubenswrapper[4816]: E0316 00:10:42.462161 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:42.962143352 +0000 UTC m=+236.058443305 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.466089 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-tv2n7"] Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.466299 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-tv2n7" podUID="59c840a8-f288-44ed-83d3-34d47041c6c6" containerName="controller-manager" containerID="cri-o://c46a7076608f889c8e30b77b33715ed49c92e64799e40fe88b9f99f6e980f6a5" gracePeriod=30 Mar 16 00:10:42 crc 
kubenswrapper[4816]: I0316 00:10:42.480906 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-tv2n7" Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.549368 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-d9j8j"] Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.549633 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-d9j8j" podUID="1d5466ab-a589-4f7e-ae89-2f494b10f6b1" containerName="route-controller-manager" containerID="cri-o://e90fdfac87f05e45b64d63ce5cb4d5902fbd18d9c1d580577069351527db0c29" gracePeriod=30 Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.557987 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-d9j8j" Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.562010 4816 patch_prober.go:28] interesting pod/router-default-5444994796-gvk75 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 16 00:10:42 crc kubenswrapper[4816]: [-]has-synced failed: reason withheld Mar 16 00:10:42 crc kubenswrapper[4816]: [+]process-running ok Mar 16 00:10:42 crc kubenswrapper[4816]: healthz check failed Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.562066 4816 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gvk75" podUID="681ca8e4-f909-4e8b-9f35-5ab8ca382e44" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.562338 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-45bbd\" (UniqueName: \"kubernetes.io/projected/ad80e1a9-75dc-4860-9bd9-d59b0c0ae43c-kube-api-access-45bbd\") pod \"community-operators-4gwcw\" (UID: \"ad80e1a9-75dc-4860-9bd9-d59b0c0ae43c\") " pod="openshift-marketplace/community-operators-4gwcw" Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.562380 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.562418 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/46c3b4df-48c2-4131-8ac4-ea6276d70d54-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"46c3b4df-48c2-4131-8ac4-ea6276d70d54\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.562471 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad80e1a9-75dc-4860-9bd9-d59b0c0ae43c-utilities\") pod \"community-operators-4gwcw\" (UID: \"ad80e1a9-75dc-4860-9bd9-d59b0c0ae43c\") " pod="openshift-marketplace/community-operators-4gwcw" Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.562487 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad80e1a9-75dc-4860-9bd9-d59b0c0ae43c-catalog-content\") pod \"community-operators-4gwcw\" (UID: \"ad80e1a9-75dc-4860-9bd9-d59b0c0ae43c\") " pod="openshift-marketplace/community-operators-4gwcw" Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.562509 4816 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/46c3b4df-48c2-4131-8ac4-ea6276d70d54-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"46c3b4df-48c2-4131-8ac4-ea6276d70d54\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 16 00:10:42 crc kubenswrapper[4816]: E0316 00:10:42.563017 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:43.063007016 +0000 UTC m=+236.159306969 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ckvwn" (UID: "b155133b-d494-44bc-aa5d-23efc7cbd7a6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.563058 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/46c3b4df-48c2-4131-8ac4-ea6276d70d54-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"46c3b4df-48c2-4131-8ac4-ea6276d70d54\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.563444 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad80e1a9-75dc-4860-9bd9-d59b0c0ae43c-catalog-content\") pod \"community-operators-4gwcw\" (UID: \"ad80e1a9-75dc-4860-9bd9-d59b0c0ae43c\") " pod="openshift-marketplace/community-operators-4gwcw" Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.563492 4816 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad80e1a9-75dc-4860-9bd9-d59b0c0ae43c-utilities\") pod \"community-operators-4gwcw\" (UID: \"ad80e1a9-75dc-4860-9bd9-d59b0c0ae43c\") " pod="openshift-marketplace/community-operators-4gwcw" Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.612273 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-wh2h7"] Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.615293 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wh2h7" Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.620780 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-45bbd\" (UniqueName: \"kubernetes.io/projected/ad80e1a9-75dc-4860-9bd9-d59b0c0ae43c-kube-api-access-45bbd\") pod \"community-operators-4gwcw\" (UID: \"ad80e1a9-75dc-4860-9bd9-d59b0c0ae43c\") " pod="openshift-marketplace/community-operators-4gwcw" Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.621139 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.637285 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/46c3b4df-48c2-4131-8ac4-ea6276d70d54-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"46c3b4df-48c2-4131-8ac4-ea6276d70d54\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.650343 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wh2h7"] Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.657894 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.665288 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4gwcw" Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.666066 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:42 crc kubenswrapper[4816]: E0316 00:10:42.666451 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:43.16643465 +0000 UTC m=+236.262734603 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.666472 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.667129 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.669388 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.671893 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.672427 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.698019 4816 ???:1] "http: TLS handshake error from 192.168.126.11:54988: no serving certificate available for the kubelet" Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.751815 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-bkxpc"] Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.756754 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bkxpc" Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.762033 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bkxpc"] Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.768151 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4mjs\" (UniqueName: \"kubernetes.io/projected/b1b3efd0-cdc0-4973-8077-bcd1ea567bdd-kube-api-access-w4mjs\") pod \"certified-operators-wh2h7\" (UID: \"b1b3efd0-cdc0-4973-8077-bcd1ea567bdd\") " pod="openshift-marketplace/certified-operators-wh2h7" Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.768187 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1b3efd0-cdc0-4973-8077-bcd1ea567bdd-catalog-content\") pod \"certified-operators-wh2h7\" (UID: \"b1b3efd0-cdc0-4973-8077-bcd1ea567bdd\") " pod="openshift-marketplace/certified-operators-wh2h7" Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.768260 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1b3efd0-cdc0-4973-8077-bcd1ea567bdd-utilities\") pod \"certified-operators-wh2h7\" (UID: \"b1b3efd0-cdc0-4973-8077-bcd1ea567bdd\") " pod="openshift-marketplace/certified-operators-wh2h7" Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.768306 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 
00:10:42.768336 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5c90a604-be49-44dc-b350-9df660d8587b-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"5c90a604-be49-44dc-b350-9df660d8587b\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.768360 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5c90a604-be49-44dc-b350-9df660d8587b-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"5c90a604-be49-44dc-b350-9df660d8587b\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 16 00:10:42 crc kubenswrapper[4816]: E0316 00:10:42.768728 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:43.268715793 +0000 UTC m=+236.365015746 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ckvwn" (UID: "b155133b-d494-44bc-aa5d-23efc7cbd7a6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.871412 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.871658 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1b3efd0-cdc0-4973-8077-bcd1ea567bdd-utilities\") pod \"certified-operators-wh2h7\" (UID: \"b1b3efd0-cdc0-4973-8077-bcd1ea567bdd\") " pod="openshift-marketplace/certified-operators-wh2h7" Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.871694 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff69863d-13e1-444c-ba61-6d68a509a203-catalog-content\") pod \"community-operators-bkxpc\" (UID: \"ff69863d-13e1-444c-ba61-6d68a509a203\") " pod="openshift-marketplace/community-operators-bkxpc" Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.871750 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5c90a604-be49-44dc-b350-9df660d8587b-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"5c90a604-be49-44dc-b350-9df660d8587b\") " 
pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.871772 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5c90a604-be49-44dc-b350-9df660d8587b-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"5c90a604-be49-44dc-b350-9df660d8587b\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.871810 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p8frh\" (UniqueName: \"kubernetes.io/projected/ff69863d-13e1-444c-ba61-6d68a509a203-kube-api-access-p8frh\") pod \"community-operators-bkxpc\" (UID: \"ff69863d-13e1-444c-ba61-6d68a509a203\") " pod="openshift-marketplace/community-operators-bkxpc" Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.874694 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5c90a604-be49-44dc-b350-9df660d8587b-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"5c90a604-be49-44dc-b350-9df660d8587b\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.875114 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1b3efd0-cdc0-4973-8077-bcd1ea567bdd-utilities\") pod \"certified-operators-wh2h7\" (UID: \"b1b3efd0-cdc0-4973-8077-bcd1ea567bdd\") " pod="openshift-marketplace/certified-operators-wh2h7" Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.877683 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w4mjs\" (UniqueName: \"kubernetes.io/projected/b1b3efd0-cdc0-4973-8077-bcd1ea567bdd-kube-api-access-w4mjs\") pod \"certified-operators-wh2h7\" (UID: \"b1b3efd0-cdc0-4973-8077-bcd1ea567bdd\") " 
pod="openshift-marketplace/certified-operators-wh2h7" Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.877723 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff69863d-13e1-444c-ba61-6d68a509a203-utilities\") pod \"community-operators-bkxpc\" (UID: \"ff69863d-13e1-444c-ba61-6d68a509a203\") " pod="openshift-marketplace/community-operators-bkxpc" Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.877741 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1b3efd0-cdc0-4973-8077-bcd1ea567bdd-catalog-content\") pod \"certified-operators-wh2h7\" (UID: \"b1b3efd0-cdc0-4973-8077-bcd1ea567bdd\") " pod="openshift-marketplace/certified-operators-wh2h7" Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.878129 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1b3efd0-cdc0-4973-8077-bcd1ea567bdd-catalog-content\") pod \"certified-operators-wh2h7\" (UID: \"b1b3efd0-cdc0-4973-8077-bcd1ea567bdd\") " pod="openshift-marketplace/certified-operators-wh2h7" Mar 16 00:10:42 crc kubenswrapper[4816]: E0316 00:10:42.878492 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:43.378469149 +0000 UTC m=+236.474769102 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.901534 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4mjs\" (UniqueName: \"kubernetes.io/projected/b1b3efd0-cdc0-4973-8077-bcd1ea567bdd-kube-api-access-w4mjs\") pod \"certified-operators-wh2h7\" (UID: \"b1b3efd0-cdc0-4973-8077-bcd1ea567bdd\") " pod="openshift-marketplace/certified-operators-wh2h7" Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.902140 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5c90a604-be49-44dc-b350-9df660d8587b-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"5c90a604-be49-44dc-b350-9df660d8587b\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.931364 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-8q2xw"] Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.932395 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8q2xw" Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.946314 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8q2xw"] Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.953504 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.978772 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff69863d-13e1-444c-ba61-6d68a509a203-catalog-content\") pod \"community-operators-bkxpc\" (UID: \"ff69863d-13e1-444c-ba61-6d68a509a203\") " pod="openshift-marketplace/community-operators-bkxpc" Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.978835 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.978899 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p8frh\" (UniqueName: \"kubernetes.io/projected/ff69863d-13e1-444c-ba61-6d68a509a203-kube-api-access-p8frh\") pod \"community-operators-bkxpc\" (UID: \"ff69863d-13e1-444c-ba61-6d68a509a203\") " pod="openshift-marketplace/community-operators-bkxpc" Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.978947 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff69863d-13e1-444c-ba61-6d68a509a203-utilities\") pod \"community-operators-bkxpc\" (UID: 
\"ff69863d-13e1-444c-ba61-6d68a509a203\") " pod="openshift-marketplace/community-operators-bkxpc" Mar 16 00:10:42 crc kubenswrapper[4816]: E0316 00:10:42.979250 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:43.4792288 +0000 UTC m=+236.575528803 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ckvwn" (UID: "b155133b-d494-44bc-aa5d-23efc7cbd7a6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.979358 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff69863d-13e1-444c-ba61-6d68a509a203-utilities\") pod \"community-operators-bkxpc\" (UID: \"ff69863d-13e1-444c-ba61-6d68a509a203\") " pod="openshift-marketplace/community-operators-bkxpc" Mar 16 00:10:42 crc kubenswrapper[4816]: I0316 00:10:42.979843 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff69863d-13e1-444c-ba61-6d68a509a203-catalog-content\") pod \"community-operators-bkxpc\" (UID: \"ff69863d-13e1-444c-ba61-6d68a509a203\") " pod="openshift-marketplace/community-operators-bkxpc" Mar 16 00:10:42 crc kubenswrapper[4816]: W0316 00:10:42.991796 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod46c3b4df_48c2_4131_8ac4_ea6276d70d54.slice/crio-2c1fef6767b0e1fca9d9b3150317fb66c123ea69c5309a134a4ad450d4919157 WatchSource:0}: Error finding container 
2c1fef6767b0e1fca9d9b3150317fb66c123ea69c5309a134a4ad450d4919157: Status 404 returned error can't find the container with id 2c1fef6767b0e1fca9d9b3150317fb66c123ea69c5309a134a4ad450d4919157 Mar 16 00:10:43 crc kubenswrapper[4816]: I0316 00:10:43.010810 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p8frh\" (UniqueName: \"kubernetes.io/projected/ff69863d-13e1-444c-ba61-6d68a509a203-kube-api-access-p8frh\") pod \"community-operators-bkxpc\" (UID: \"ff69863d-13e1-444c-ba61-6d68a509a203\") " pod="openshift-marketplace/community-operators-bkxpc" Mar 16 00:10:43 crc kubenswrapper[4816]: I0316 00:10:43.016929 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wh2h7" Mar 16 00:10:43 crc kubenswrapper[4816]: I0316 00:10:43.033518 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4gwcw"] Mar 16 00:10:43 crc kubenswrapper[4816]: W0316 00:10:43.045262 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podad80e1a9_75dc_4860_9bd9_d59b0c0ae43c.slice/crio-661437598c338aed0d5a7d52e67330434003899adaefd998268791f6175ab8ca WatchSource:0}: Error finding container 661437598c338aed0d5a7d52e67330434003899adaefd998268791f6175ab8ca: Status 404 returned error can't find the container with id 661437598c338aed0d5a7d52e67330434003899adaefd998268791f6175ab8ca Mar 16 00:10:43 crc kubenswrapper[4816]: I0316 00:10:43.070542 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 16 00:10:43 crc kubenswrapper[4816]: I0316 00:10:43.074309 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-d4vrm" Mar 16 00:10:43 crc kubenswrapper[4816]: I0316 00:10:43.079926 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:43 crc kubenswrapper[4816]: I0316 00:10:43.080085 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8gwq2\" (UniqueName: \"kubernetes.io/projected/bf586c6e-f957-46fc-8140-f9a9ea22510f-kube-api-access-8gwq2\") pod \"certified-operators-8q2xw\" (UID: \"bf586c6e-f957-46fc-8140-f9a9ea22510f\") " pod="openshift-marketplace/certified-operators-8q2xw" Mar 16 00:10:43 crc kubenswrapper[4816]: I0316 00:10:43.080149 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf586c6e-f957-46fc-8140-f9a9ea22510f-utilities\") pod \"certified-operators-8q2xw\" (UID: \"bf586c6e-f957-46fc-8140-f9a9ea22510f\") " pod="openshift-marketplace/certified-operators-8q2xw" Mar 16 00:10:43 crc kubenswrapper[4816]: I0316 00:10:43.080176 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf586c6e-f957-46fc-8140-f9a9ea22510f-catalog-content\") pod \"certified-operators-8q2xw\" (UID: \"bf586c6e-f957-46fc-8140-f9a9ea22510f\") " pod="openshift-marketplace/certified-operators-8q2xw" Mar 16 00:10:43 crc kubenswrapper[4816]: E0316 00:10:43.080284 
4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:43.580271399 +0000 UTC m=+236.676571352 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:43 crc kubenswrapper[4816]: I0316 00:10:43.086730 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bkxpc" Mar 16 00:10:43 crc kubenswrapper[4816]: I0316 00:10:43.182033 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8gwq2\" (UniqueName: \"kubernetes.io/projected/bf586c6e-f957-46fc-8140-f9a9ea22510f-kube-api-access-8gwq2\") pod \"certified-operators-8q2xw\" (UID: \"bf586c6e-f957-46fc-8140-f9a9ea22510f\") " pod="openshift-marketplace/certified-operators-8q2xw" Mar 16 00:10:43 crc kubenswrapper[4816]: I0316 00:10:43.182452 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:10:43 crc kubenswrapper[4816]: I0316 00:10:43.182496 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/bf586c6e-f957-46fc-8140-f9a9ea22510f-utilities\") pod \"certified-operators-8q2xw\" (UID: \"bf586c6e-f957-46fc-8140-f9a9ea22510f\") " pod="openshift-marketplace/certified-operators-8q2xw" Mar 16 00:10:43 crc kubenswrapper[4816]: I0316 00:10:43.182518 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf586c6e-f957-46fc-8140-f9a9ea22510f-catalog-content\") pod \"certified-operators-8q2xw\" (UID: \"bf586c6e-f957-46fc-8140-f9a9ea22510f\") " pod="openshift-marketplace/certified-operators-8q2xw" Mar 16 00:10:43 crc kubenswrapper[4816]: E0316 00:10:43.182932 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:43.682919472 +0000 UTC m=+236.779219425 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ckvwn" (UID: "b155133b-d494-44bc-aa5d-23efc7cbd7a6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:43 crc kubenswrapper[4816]: I0316 00:10:43.229953 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf586c6e-f957-46fc-8140-f9a9ea22510f-catalog-content\") pod \"certified-operators-8q2xw\" (UID: \"bf586c6e-f957-46fc-8140-f9a9ea22510f\") " pod="openshift-marketplace/certified-operators-8q2xw" Mar 16 00:10:43 crc kubenswrapper[4816]: I0316 00:10:43.239904 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8gwq2\" 
(UniqueName: \"kubernetes.io/projected/bf586c6e-f957-46fc-8140-f9a9ea22510f-kube-api-access-8gwq2\") pod \"certified-operators-8q2xw\" (UID: \"bf586c6e-f957-46fc-8140-f9a9ea22510f\") " pod="openshift-marketplace/certified-operators-8q2xw" Mar 16 00:10:43 crc kubenswrapper[4816]: I0316 00:10:43.245788 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf586c6e-f957-46fc-8140-f9a9ea22510f-utilities\") pod \"certified-operators-8q2xw\" (UID: \"bf586c6e-f957-46fc-8140-f9a9ea22510f\") " pod="openshift-marketplace/certified-operators-8q2xw" Mar 16 00:10:43 crc kubenswrapper[4816]: I0316 00:10:43.271201 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8q2xw" Mar 16 00:10:43 crc kubenswrapper[4816]: I0316 00:10:43.280781 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wh2h7"] Mar 16 00:10:43 crc kubenswrapper[4816]: I0316 00:10:43.284118 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:43 crc kubenswrapper[4816]: E0316 00:10:43.284488 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:43.784473494 +0000 UTC m=+236.880773447 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:43 crc kubenswrapper[4816]: I0316 00:10:43.374174 4816 generic.go:334] "Generic (PLEG): container finished" podID="1d5466ab-a589-4f7e-ae89-2f494b10f6b1" containerID="e90fdfac87f05e45b64d63ce5cb4d5902fbd18d9c1d580577069351527db0c29" exitCode=0 Mar 16 00:10:43 crc kubenswrapper[4816]: I0316 00:10:43.374220 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-d9j8j" event={"ID":"1d5466ab-a589-4f7e-ae89-2f494b10f6b1","Type":"ContainerDied","Data":"e90fdfac87f05e45b64d63ce5cb4d5902fbd18d9c1d580577069351527db0c29"} Mar 16 00:10:43 crc kubenswrapper[4816]: I0316 00:10:43.375749 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4gwcw" event={"ID":"ad80e1a9-75dc-4860-9bd9-d59b0c0ae43c","Type":"ContainerStarted","Data":"661437598c338aed0d5a7d52e67330434003899adaefd998268791f6175ab8ca"} Mar 16 00:10:43 crc kubenswrapper[4816]: I0316 00:10:43.382769 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wh2h7" event={"ID":"b1b3efd0-cdc0-4973-8077-bcd1ea567bdd","Type":"ContainerStarted","Data":"0e89bdbfb4ed11608191b3360966bdeb2f13767d41154d3097545518437bcaec"} Mar 16 00:10:43 crc kubenswrapper[4816]: I0316 00:10:43.385068 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" 
event={"ID":"46c3b4df-48c2-4131-8ac4-ea6276d70d54","Type":"ContainerStarted","Data":"2c1fef6767b0e1fca9d9b3150317fb66c123ea69c5309a134a4ad450d4919157"} Mar 16 00:10:43 crc kubenswrapper[4816]: I0316 00:10:43.385393 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:10:43 crc kubenswrapper[4816]: E0316 00:10:43.386422 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:43.885796261 +0000 UTC m=+236.982096214 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ckvwn" (UID: "b155133b-d494-44bc-aa5d-23efc7cbd7a6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:43 crc kubenswrapper[4816]: I0316 00:10:43.386753 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bkxpc"] Mar 16 00:10:43 crc kubenswrapper[4816]: W0316 00:10:43.396096 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podff69863d_13e1_444c_ba61_6d68a509a203.slice/crio-bb49e18eedefb469d6109a46587df4a9ef4eb1e0a35954df9209a551fbb7b5b4 WatchSource:0}: Error finding container 
bb49e18eedefb469d6109a46587df4a9ef4eb1e0a35954df9209a551fbb7b5b4: Status 404 returned error can't find the container with id bb49e18eedefb469d6109a46587df4a9ef4eb1e0a35954df9209a551fbb7b5b4 Mar 16 00:10:43 crc kubenswrapper[4816]: I0316 00:10:43.487157 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:43 crc kubenswrapper[4816]: E0316 00:10:43.487516 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:43.987490657 +0000 UTC m=+237.083790610 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:43 crc kubenswrapper[4816]: I0316 00:10:43.487796 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:10:43 crc kubenswrapper[4816]: E0316 00:10:43.488502 4816 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:43.988480874 +0000 UTC m=+237.084780827 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ckvwn" (UID: "b155133b-d494-44bc-aa5d-23efc7cbd7a6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:43 crc kubenswrapper[4816]: I0316 00:10:43.530785 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8q2xw"] Mar 16 00:10:43 crc kubenswrapper[4816]: W0316 00:10:43.538283 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbf586c6e_f957_46fc_8140_f9a9ea22510f.slice/crio-e4ae76a3c7fcca7fb114ac9afc90c35f9554a0225d9ad44974098c92d5909906 WatchSource:0}: Error finding container e4ae76a3c7fcca7fb114ac9afc90c35f9554a0225d9ad44974098c92d5909906: Status 404 returned error can't find the container with id e4ae76a3c7fcca7fb114ac9afc90c35f9554a0225d9ad44974098c92d5909906 Mar 16 00:10:43 crc kubenswrapper[4816]: I0316 00:10:43.556769 4816 patch_prober.go:28] interesting pod/router-default-5444994796-gvk75 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 16 00:10:43 crc kubenswrapper[4816]: [-]has-synced failed: reason withheld Mar 16 00:10:43 crc kubenswrapper[4816]: [+]process-running ok Mar 16 00:10:43 crc kubenswrapper[4816]: healthz check failed Mar 16 00:10:43 crc kubenswrapper[4816]: I0316 00:10:43.556821 4816 
prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gvk75" podUID="681ca8e4-f909-4e8b-9f35-5ab8ca382e44" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 16 00:10:43 crc kubenswrapper[4816]: I0316 00:10:43.589070 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:43 crc kubenswrapper[4816]: E0316 00:10:43.589530 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:44.089513503 +0000 UTC m=+237.185813456 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:43 crc kubenswrapper[4816]: I0316 00:10:43.667659 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 16 00:10:43 crc kubenswrapper[4816]: I0316 00:10:43.690625 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:10:43 crc kubenswrapper[4816]: E0316 00:10:43.691131 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:44.191094406 +0000 UTC m=+237.287394369 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ckvwn" (UID: "b155133b-d494-44bc-aa5d-23efc7cbd7a6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:43 crc kubenswrapper[4816]: W0316 00:10:43.737877 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod5c90a604_be49_44dc_b350_9df660d8587b.slice/crio-f6dda169be5b3a08dfad7ad03016a7926ed8e41ada3f5914164856f0c44926f0 WatchSource:0}: Error finding container f6dda169be5b3a08dfad7ad03016a7926ed8e41ada3f5914164856f0c44926f0: Status 404 returned error can't find the container with id f6dda169be5b3a08dfad7ad03016a7926ed8e41ada3f5914164856f0c44926f0 Mar 16 00:10:43 crc kubenswrapper[4816]: I0316 00:10:43.791294 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:43 crc kubenswrapper[4816]: E0316 00:10:43.791747 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:44.291727624 +0000 UTC m=+237.388027577 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:43 crc kubenswrapper[4816]: I0316 00:10:43.907237 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:10:43 crc kubenswrapper[4816]: E0316 00:10:43.907638 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:44.407623188 +0000 UTC m=+237.503923161 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ckvwn" (UID: "b155133b-d494-44bc-aa5d-23efc7cbd7a6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:43 crc kubenswrapper[4816]: I0316 00:10:43.948925 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-pdm8d" Mar 16 00:10:43 crc kubenswrapper[4816]: I0316 00:10:43.948963 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-pdm8d" Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.008662 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:44 crc kubenswrapper[4816]: E0316 00:10:44.008825 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:44.508806111 +0000 UTC m=+237.605106064 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.008981 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:10:44 crc kubenswrapper[4816]: E0316 00:10:44.009319 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:44.509309385 +0000 UTC m=+237.605609338 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ckvwn" (UID: "b155133b-d494-44bc-aa5d-23efc7cbd7a6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.064852 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zrq8d" Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.073612 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zrq8d" Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.110829 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:44 crc kubenswrapper[4816]: E0316 00:10:44.112588 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:44.612532392 +0000 UTC m=+237.708832345 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.212628 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:10:44 crc kubenswrapper[4816]: E0316 00:10:44.213163 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:44.713145449 +0000 UTC m=+237.809445402 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ckvwn" (UID: "b155133b-d494-44bc-aa5d-23efc7cbd7a6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.313835 4816 patch_prober.go:28] interesting pod/apiserver-76f77b778f-pdm8d container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Mar 16 00:10:44 crc kubenswrapper[4816]: [+]log ok Mar 16 00:10:44 crc kubenswrapper[4816]: [+]etcd ok Mar 16 00:10:44 crc kubenswrapper[4816]: [+]poststarthook/start-apiserver-admission-initializer ok Mar 16 00:10:44 crc kubenswrapper[4816]: [+]poststarthook/generic-apiserver-start-informers ok Mar 16 00:10:44 crc kubenswrapper[4816]: [+]poststarthook/max-in-flight-filter ok Mar 16 00:10:44 crc kubenswrapper[4816]: [+]poststarthook/storage-object-count-tracker-hook ok Mar 16 00:10:44 crc kubenswrapper[4816]: [+]poststarthook/image.openshift.io-apiserver-caches ok Mar 16 00:10:44 crc kubenswrapper[4816]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Mar 16 00:10:44 crc kubenswrapper[4816]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Mar 16 00:10:44 crc kubenswrapper[4816]: [+]poststarthook/project.openshift.io-projectcache ok Mar 16 00:10:44 crc kubenswrapper[4816]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Mar 16 00:10:44 crc kubenswrapper[4816]: [+]poststarthook/openshift.io-startinformers ok Mar 16 00:10:44 crc kubenswrapper[4816]: [+]poststarthook/openshift.io-restmapperupdater ok Mar 16 00:10:44 crc 
kubenswrapper[4816]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Mar 16 00:10:44 crc kubenswrapper[4816]: livez check failed Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.313942 4816 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-pdm8d" podUID="1f26ea52-1f97-4d4a-98bd-897c5b3b88c5" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.314078 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:44 crc kubenswrapper[4816]: E0316 00:10:44.314633 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:44.814615889 +0000 UTC m=+237.910915842 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.338001 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-7pb49"] Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.339152 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7pb49" Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.342218 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.346646 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7pb49"] Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.361013 4816 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-d9j8j container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.11:8443/healthz\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body= Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.361064 4816 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-d9j8j" podUID="1d5466ab-a589-4f7e-ae89-2f494b10f6b1" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.11:8443/healthz\": dial tcp 10.217.0.11:8443: connect: connection refused" Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.415433 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j24xj\" (UniqueName: \"kubernetes.io/projected/a5ba22dd-8e8e-4beb-a540-e5c9687810b8-kube-api-access-j24xj\") pod \"redhat-marketplace-7pb49\" (UID: \"a5ba22dd-8e8e-4beb-a540-e5c9687810b8\") " pod="openshift-marketplace/redhat-marketplace-7pb49" Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.415480 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5ba22dd-8e8e-4beb-a540-e5c9687810b8-catalog-content\") pod \"redhat-marketplace-7pb49\" (UID: 
\"a5ba22dd-8e8e-4beb-a540-e5c9687810b8\") " pod="openshift-marketplace/redhat-marketplace-7pb49" Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.415586 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.415639 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5ba22dd-8e8e-4beb-a540-e5c9687810b8-utilities\") pod \"redhat-marketplace-7pb49\" (UID: \"a5ba22dd-8e8e-4beb-a540-e5c9687810b8\") " pod="openshift-marketplace/redhat-marketplace-7pb49" Mar 16 00:10:44 crc kubenswrapper[4816]: E0316 00:10:44.415926 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:44.915911305 +0000 UTC m=+238.012211258 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ckvwn" (UID: "b155133b-d494-44bc-aa5d-23efc7cbd7a6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.419299 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-nnqsw" Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.419340 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-nnqsw" Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.424391 4816 patch_prober.go:28] interesting pod/console-f9d7485db-nnqsw container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.5:8443/health\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body= Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.424452 4816 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-nnqsw" podUID="32cac2d0-56f0-4ba8-86dc-9b57c0fcc11c" containerName="console" probeResult="failure" output="Get \"https://10.217.0.5:8443/health\": dial tcp 10.217.0.5:8443: connect: connection refused" Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.434631 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"5c90a604-be49-44dc-b350-9df660d8587b","Type":"ContainerStarted","Data":"de4df241bf0429f4c7d3687854c459dc61daeb2dd35192a1c0611d80c7988415"} Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.434682 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" 
event={"ID":"5c90a604-be49-44dc-b350-9df660d8587b","Type":"ContainerStarted","Data":"f6dda169be5b3a08dfad7ad03016a7926ed8e41ada3f5914164856f0c44926f0"} Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.437402 4816 generic.go:334] "Generic (PLEG): container finished" podID="59c840a8-f288-44ed-83d3-34d47041c6c6" containerID="c46a7076608f889c8e30b77b33715ed49c92e64799e40fe88b9f99f6e980f6a5" exitCode=0 Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.437459 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-tv2n7" event={"ID":"59c840a8-f288-44ed-83d3-34d47041c6c6","Type":"ContainerDied","Data":"c46a7076608f889c8e30b77b33715ed49c92e64799e40fe88b9f99f6e980f6a5"} Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.438676 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"46c3b4df-48c2-4131-8ac4-ea6276d70d54","Type":"ContainerStarted","Data":"cf91bdab1d487b3e9c82088df50a44f305fd300ad681fc9e9b0b6cb01f350748"} Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.440611 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-trp9l" event={"ID":"a1a13dfd-7f8c-4a52-9ace-ffb14d3d91f2","Type":"ContainerStarted","Data":"4aa6683570021372f0c3ac10aa502521835a45064050e765d6899b5fc0d01fa9"} Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.441915 4816 generic.go:334] "Generic (PLEG): container finished" podID="ad80e1a9-75dc-4860-9bd9-d59b0c0ae43c" containerID="16a79ba542cf5cd202f64693662a992a86e39b69458f390ed8cb6f7dbabffd89" exitCode=0 Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.441962 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4gwcw" event={"ID":"ad80e1a9-75dc-4860-9bd9-d59b0c0ae43c","Type":"ContainerDied","Data":"16a79ba542cf5cd202f64693662a992a86e39b69458f390ed8cb6f7dbabffd89"} Mar 16 00:10:44 crc 
kubenswrapper[4816]: I0316 00:10:44.444313 4816 generic.go:334] "Generic (PLEG): container finished" podID="ff69863d-13e1-444c-ba61-6d68a509a203" containerID="2d7e1ead92ce8010c6084321e28f13a3b17186a0141a9a086a18947183a41d47" exitCode=0 Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.444449 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bkxpc" event={"ID":"ff69863d-13e1-444c-ba61-6d68a509a203","Type":"ContainerDied","Data":"2d7e1ead92ce8010c6084321e28f13a3b17186a0141a9a086a18947183a41d47"} Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.444480 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bkxpc" event={"ID":"ff69863d-13e1-444c-ba61-6d68a509a203","Type":"ContainerStarted","Data":"bb49e18eedefb469d6109a46587df4a9ef4eb1e0a35954df9209a551fbb7b5b4"} Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.456097 4816 generic.go:334] "Generic (PLEG): container finished" podID="9397185e-a9e3-4ef4-b0be-d9dc9208adff" containerID="a26a0a4314d400c57bc06c18480ab7a501ebc981f4b8dbd60334dd3390aec49c" exitCode=0 Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.456251 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29560320-4hk5d" event={"ID":"9397185e-a9e3-4ef4-b0be-d9dc9208adff","Type":"ContainerDied","Data":"a26a0a4314d400c57bc06c18480ab7a501ebc981f4b8dbd60334dd3390aec49c"} Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.465437 4816 generic.go:334] "Generic (PLEG): container finished" podID="bf586c6e-f957-46fc-8140-f9a9ea22510f" containerID="307037ba13f42f68192fcc6d4406e472de7d9aac5f7546be49cd42537db26240" exitCode=0 Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.465517 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8q2xw" 
event={"ID":"bf586c6e-f957-46fc-8140-f9a9ea22510f","Type":"ContainerDied","Data":"307037ba13f42f68192fcc6d4406e472de7d9aac5f7546be49cd42537db26240"} Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.465539 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8q2xw" event={"ID":"bf586c6e-f957-46fc-8140-f9a9ea22510f","Type":"ContainerStarted","Data":"e4ae76a3c7fcca7fb114ac9afc90c35f9554a0225d9ad44974098c92d5909906"} Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.481700 4816 generic.go:334] "Generic (PLEG): container finished" podID="b1b3efd0-cdc0-4973-8077-bcd1ea567bdd" containerID="c898ff116eae4b6d21df6664b26fd09b1827c5eec1cd2e0032f6fcd35a691638" exitCode=0 Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.481856 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wh2h7" event={"ID":"b1b3efd0-cdc0-4973-8077-bcd1ea567bdd","Type":"ContainerDied","Data":"c898ff116eae4b6d21df6664b26fd09b1827c5eec1cd2e0032f6fcd35a691638"} Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.516274 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:44 crc kubenswrapper[4816]: E0316 00:10:44.516435 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:45.016413289 +0000 UTC m=+238.112713252 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.516499 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.516596 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5ba22dd-8e8e-4beb-a540-e5c9687810b8-utilities\") pod \"redhat-marketplace-7pb49\" (UID: \"a5ba22dd-8e8e-4beb-a540-e5c9687810b8\") " pod="openshift-marketplace/redhat-marketplace-7pb49" Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.516705 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j24xj\" (UniqueName: \"kubernetes.io/projected/a5ba22dd-8e8e-4beb-a540-e5c9687810b8-kube-api-access-j24xj\") pod \"redhat-marketplace-7pb49\" (UID: \"a5ba22dd-8e8e-4beb-a540-e5c9687810b8\") " pod="openshift-marketplace/redhat-marketplace-7pb49" Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.516729 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5ba22dd-8e8e-4beb-a540-e5c9687810b8-catalog-content\") pod \"redhat-marketplace-7pb49\" (UID: 
\"a5ba22dd-8e8e-4beb-a540-e5c9687810b8\") " pod="openshift-marketplace/redhat-marketplace-7pb49" Mar 16 00:10:44 crc kubenswrapper[4816]: E0316 00:10:44.516903 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:45.016886572 +0000 UTC m=+238.113186525 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ckvwn" (UID: "b155133b-d494-44bc-aa5d-23efc7cbd7a6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.517796 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5ba22dd-8e8e-4beb-a540-e5c9687810b8-utilities\") pod \"redhat-marketplace-7pb49\" (UID: \"a5ba22dd-8e8e-4beb-a540-e5c9687810b8\") " pod="openshift-marketplace/redhat-marketplace-7pb49" Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.518061 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5ba22dd-8e8e-4beb-a540-e5c9687810b8-catalog-content\") pod \"redhat-marketplace-7pb49\" (UID: \"a5ba22dd-8e8e-4beb-a540-e5c9687810b8\") " pod="openshift-marketplace/redhat-marketplace-7pb49" Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.524409 4816 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.541314 4816 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j24xj\" (UniqueName: \"kubernetes.io/projected/a5ba22dd-8e8e-4beb-a540-e5c9687810b8-kube-api-access-j24xj\") pod \"redhat-marketplace-7pb49\" (UID: \"a5ba22dd-8e8e-4beb-a540-e5c9687810b8\") " pod="openshift-marketplace/redhat-marketplace-7pb49" Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.550350 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-gvk75" Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.553806 4816 patch_prober.go:28] interesting pod/router-default-5444994796-gvk75 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 16 00:10:44 crc kubenswrapper[4816]: [-]has-synced failed: reason withheld Mar 16 00:10:44 crc kubenswrapper[4816]: [+]process-running ok Mar 16 00:10:44 crc kubenswrapper[4816]: healthz check failed Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.553855 4816 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gvk75" podUID="681ca8e4-f909-4e8b-9f35-5ab8ca382e44" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.618092 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:44 crc kubenswrapper[4816]: E0316 00:10:44.618306 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:45.1182595 +0000 UTC m=+238.214559453 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.618926 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:10:44 crc kubenswrapper[4816]: E0316 00:10:44.619672 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:45.119654038 +0000 UTC m=+238.215954081 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ckvwn" (UID: "b155133b-d494-44bc-aa5d-23efc7cbd7a6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.654336 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7pb49" Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.697985 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-d9j8j" Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.719492 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:44 crc kubenswrapper[4816]: E0316 00:10:44.720059 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:45.220039969 +0000 UTC m=+238.316339922 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.732275 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-q9xc9" Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.735579 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-tv2n7" Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.747895 4816 patch_prober.go:28] interesting pod/downloads-7954f5f757-5rr7c container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.747901 4816 patch_prober.go:28] interesting pod/downloads-7954f5f757-5rr7c container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.748525 4816 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-5rr7c" podUID="0ec3cdc0-f024-43cf-b520-7d2437e0f8df" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.747956 4816 prober.go:107] "Probe failed" 
probeType="Readiness" pod="openshift-console/downloads-7954f5f757-5rr7c" podUID="0ec3cdc0-f024-43cf-b520-7d2437e0f8df" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.757216 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-6gbkl"] Mar 16 00:10:44 crc kubenswrapper[4816]: E0316 00:10:44.764050 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59c840a8-f288-44ed-83d3-34d47041c6c6" containerName="controller-manager" Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.764078 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="59c840a8-f288-44ed-83d3-34d47041c6c6" containerName="controller-manager" Mar 16 00:10:44 crc kubenswrapper[4816]: E0316 00:10:44.764090 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d5466ab-a589-4f7e-ae89-2f494b10f6b1" containerName="route-controller-manager" Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.764098 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d5466ab-a589-4f7e-ae89-2f494b10f6b1" containerName="route-controller-manager" Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.764218 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="59c840a8-f288-44ed-83d3-34d47041c6c6" containerName="controller-manager" Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.764237 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d5466ab-a589-4f7e-ae89-2f494b10f6b1" containerName="route-controller-manager" Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.765507 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6gbkl" Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.783800 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6gbkl"] Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.811316 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-gtwmf" Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.820423 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d5466ab-a589-4f7e-ae89-2f494b10f6b1-config\") pod \"1d5466ab-a589-4f7e-ae89-2f494b10f6b1\" (UID: \"1d5466ab-a589-4f7e-ae89-2f494b10f6b1\") " Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.820467 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1d5466ab-a589-4f7e-ae89-2f494b10f6b1-serving-cert\") pod \"1d5466ab-a589-4f7e-ae89-2f494b10f6b1\" (UID: \"1d5466ab-a589-4f7e-ae89-2f494b10f6b1\") " Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.820507 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59c840a8-f288-44ed-83d3-34d47041c6c6-config\") pod \"59c840a8-f288-44ed-83d3-34d47041c6c6\" (UID: \"59c840a8-f288-44ed-83d3-34d47041c6c6\") " Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.820779 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fvtdx\" (UniqueName: \"kubernetes.io/projected/1d5466ab-a589-4f7e-ae89-2f494b10f6b1-kube-api-access-fvtdx\") pod \"1d5466ab-a589-4f7e-ae89-2f494b10f6b1\" (UID: \"1d5466ab-a589-4f7e-ae89-2f494b10f6b1\") " Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.820812 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1d5466ab-a589-4f7e-ae89-2f494b10f6b1-client-ca\") pod \"1d5466ab-a589-4f7e-ae89-2f494b10f6b1\" (UID: \"1d5466ab-a589-4f7e-ae89-2f494b10f6b1\") " Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.820870 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/59c840a8-f288-44ed-83d3-34d47041c6c6-client-ca\") pod \"59c840a8-f288-44ed-83d3-34d47041c6c6\" (UID: \"59c840a8-f288-44ed-83d3-34d47041c6c6\") " Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.820889 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/59c840a8-f288-44ed-83d3-34d47041c6c6-proxy-ca-bundles\") pod \"59c840a8-f288-44ed-83d3-34d47041c6c6\" (UID: \"59c840a8-f288-44ed-83d3-34d47041c6c6\") " Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.820945 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zsbsw\" (UniqueName: \"kubernetes.io/projected/59c840a8-f288-44ed-83d3-34d47041c6c6-kube-api-access-zsbsw\") pod \"59c840a8-f288-44ed-83d3-34d47041c6c6\" (UID: \"59c840a8-f288-44ed-83d3-34d47041c6c6\") " Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.821012 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/59c840a8-f288-44ed-83d3-34d47041c6c6-serving-cert\") pod \"59c840a8-f288-44ed-83d3-34d47041c6c6\" (UID: \"59c840a8-f288-44ed-83d3-34d47041c6c6\") " Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.821261 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: 
\"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.821390 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1d5466ab-a589-4f7e-ae89-2f494b10f6b1-config" (OuterVolumeSpecName: "config") pod "1d5466ab-a589-4f7e-ae89-2f494b10f6b1" (UID: "1d5466ab-a589-4f7e-ae89-2f494b10f6b1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.821689 4816 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d5466ab-a589-4f7e-ae89-2f494b10f6b1-config\") on node \"crc\" DevicePath \"\"" Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.822170 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/59c840a8-f288-44ed-83d3-34d47041c6c6-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "59c840a8-f288-44ed-83d3-34d47041c6c6" (UID: "59c840a8-f288-44ed-83d3-34d47041c6c6"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:10:44 crc kubenswrapper[4816]: E0316 00:10:44.822260 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:45.322247749 +0000 UTC m=+238.418547702 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ckvwn" (UID: "b155133b-d494-44bc-aa5d-23efc7cbd7a6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.822599 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/59c840a8-f288-44ed-83d3-34d47041c6c6-client-ca" (OuterVolumeSpecName: "client-ca") pod "59c840a8-f288-44ed-83d3-34d47041c6c6" (UID: "59c840a8-f288-44ed-83d3-34d47041c6c6"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.823100 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/59c840a8-f288-44ed-83d3-34d47041c6c6-config" (OuterVolumeSpecName: "config") pod "59c840a8-f288-44ed-83d3-34d47041c6c6" (UID: "59c840a8-f288-44ed-83d3-34d47041c6c6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.824636 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1d5466ab-a589-4f7e-ae89-2f494b10f6b1-client-ca" (OuterVolumeSpecName: "client-ca") pod "1d5466ab-a589-4f7e-ae89-2f494b10f6b1" (UID: "1d5466ab-a589-4f7e-ae89-2f494b10f6b1"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.827994 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59c840a8-f288-44ed-83d3-34d47041c6c6-kube-api-access-zsbsw" (OuterVolumeSpecName: "kube-api-access-zsbsw") pod "59c840a8-f288-44ed-83d3-34d47041c6c6" (UID: "59c840a8-f288-44ed-83d3-34d47041c6c6"). InnerVolumeSpecName "kube-api-access-zsbsw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.828087 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d5466ab-a589-4f7e-ae89-2f494b10f6b1-kube-api-access-fvtdx" (OuterVolumeSpecName: "kube-api-access-fvtdx") pod "1d5466ab-a589-4f7e-ae89-2f494b10f6b1" (UID: "1d5466ab-a589-4f7e-ae89-2f494b10f6b1"). InnerVolumeSpecName "kube-api-access-fvtdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.829085 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d5466ab-a589-4f7e-ae89-2f494b10f6b1-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1d5466ab-a589-4f7e-ae89-2f494b10f6b1" (UID: "1d5466ab-a589-4f7e-ae89-2f494b10f6b1"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.841321 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59c840a8-f288-44ed-83d3-34d47041c6c6-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "59c840a8-f288-44ed-83d3-34d47041c6c6" (UID: "59c840a8-f288-44ed-83d3-34d47041c6c6"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.888455 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7pb49"] Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.890057 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zn6w7" Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.923208 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.923379 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bad7b5f7-88a8-4c20-a010-734a46f59e05-catalog-content\") pod \"redhat-marketplace-6gbkl\" (UID: \"bad7b5f7-88a8-4c20-a010-734a46f59e05\") " pod="openshift-marketplace/redhat-marketplace-6gbkl" Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.923468 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bad7b5f7-88a8-4c20-a010-734a46f59e05-utilities\") pod \"redhat-marketplace-6gbkl\" (UID: \"bad7b5f7-88a8-4c20-a010-734a46f59e05\") " pod="openshift-marketplace/redhat-marketplace-6gbkl" Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.923524 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2d9d\" (UniqueName: \"kubernetes.io/projected/bad7b5f7-88a8-4c20-a010-734a46f59e05-kube-api-access-l2d9d\") pod \"redhat-marketplace-6gbkl\" (UID: 
\"bad7b5f7-88a8-4c20-a010-734a46f59e05\") " pod="openshift-marketplace/redhat-marketplace-6gbkl" Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.923597 4816 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1d5466ab-a589-4f7e-ae89-2f494b10f6b1-client-ca\") on node \"crc\" DevicePath \"\"" Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.923611 4816 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/59c840a8-f288-44ed-83d3-34d47041c6c6-client-ca\") on node \"crc\" DevicePath \"\"" Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.923621 4816 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/59c840a8-f288-44ed-83d3-34d47041c6c6-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.923630 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zsbsw\" (UniqueName: \"kubernetes.io/projected/59c840a8-f288-44ed-83d3-34d47041c6c6-kube-api-access-zsbsw\") on node \"crc\" DevicePath \"\"" Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.923639 4816 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/59c840a8-f288-44ed-83d3-34d47041c6c6-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.923647 4816 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1d5466ab-a589-4f7e-ae89-2f494b10f6b1-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.923657 4816 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59c840a8-f288-44ed-83d3-34d47041c6c6-config\") on node \"crc\" DevicePath \"\"" Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 
00:10:44.923667 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fvtdx\" (UniqueName: \"kubernetes.io/projected/1d5466ab-a589-4f7e-ae89-2f494b10f6b1-kube-api-access-fvtdx\") on node \"crc\" DevicePath \"\"" Mar 16 00:10:44 crc kubenswrapper[4816]: E0316 00:10:44.924195 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:45.424161842 +0000 UTC m=+238.520461815 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.930753 4816 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-03-16T00:10:44.524432908Z","Handler":null,"Name":""} Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.934932 4816 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Mar 16 00:10:44 crc kubenswrapper[4816]: I0316 00:10:44.934968 4816 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.025031 4816 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.025103 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bad7b5f7-88a8-4c20-a010-734a46f59e05-utilities\") pod \"redhat-marketplace-6gbkl\" (UID: \"bad7b5f7-88a8-4c20-a010-734a46f59e05\") " pod="openshift-marketplace/redhat-marketplace-6gbkl" Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.025166 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l2d9d\" (UniqueName: \"kubernetes.io/projected/bad7b5f7-88a8-4c20-a010-734a46f59e05-kube-api-access-l2d9d\") pod \"redhat-marketplace-6gbkl\" (UID: \"bad7b5f7-88a8-4c20-a010-734a46f59e05\") " pod="openshift-marketplace/redhat-marketplace-6gbkl" Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.025245 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bad7b5f7-88a8-4c20-a010-734a46f59e05-catalog-content\") pod \"redhat-marketplace-6gbkl\" (UID: \"bad7b5f7-88a8-4c20-a010-734a46f59e05\") " pod="openshift-marketplace/redhat-marketplace-6gbkl" Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.025968 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bad7b5f7-88a8-4c20-a010-734a46f59e05-catalog-content\") pod \"redhat-marketplace-6gbkl\" (UID: \"bad7b5f7-88a8-4c20-a010-734a46f59e05\") " pod="openshift-marketplace/redhat-marketplace-6gbkl" Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.026830 4816 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bad7b5f7-88a8-4c20-a010-734a46f59e05-utilities\") pod \"redhat-marketplace-6gbkl\" (UID: \"bad7b5f7-88a8-4c20-a010-734a46f59e05\") " pod="openshift-marketplace/redhat-marketplace-6gbkl" Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.029089 4816 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.029121 4816 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.048520 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2d9d\" (UniqueName: \"kubernetes.io/projected/bad7b5f7-88a8-4c20-a010-734a46f59e05-kube-api-access-l2d9d\") pod \"redhat-marketplace-6gbkl\" (UID: \"bad7b5f7-88a8-4c20-a010-734a46f59e05\") " pod="openshift-marketplace/redhat-marketplace-6gbkl" Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.077501 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ckvwn\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 
00:10:45.098390 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6gbkl" Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.128852 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.136649 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.139650 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-nsxl4" Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.144804 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-nsxl4" Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.227253 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-8226q" Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.232482 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-8226q" Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.276307 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.280156 4816 ???:1] "http: TLS handshake error from 192.168.126.11:55002: no serving certificate available for the kubelet" Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.332832 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-sshl5" Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.379827 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-64d6fc58d9-vgh6t"] Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.380867 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-64d6fc58d9-vgh6t" Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.387311 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6d597ffc5b-jhblv"] Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.389107 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6d597ffc5b-jhblv" Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.398606 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-64d6fc58d9-vgh6t"] Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.412329 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6d597ffc5b-jhblv"] Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.494157 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-d9j8j" event={"ID":"1d5466ab-a589-4f7e-ae89-2f494b10f6b1","Type":"ContainerDied","Data":"8bd92ab2e8746013ff96fbb3362f4a912a98fe884156f1b95b8704505ab4fe1a"} Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.494233 4816 scope.go:117] "RemoveContainer" containerID="e90fdfac87f05e45b64d63ce5cb4d5902fbd18d9c1d580577069351527db0c29" Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.494397 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-d9j8j" Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.512210 4816 generic.go:334] "Generic (PLEG): container finished" podID="5c90a604-be49-44dc-b350-9df660d8587b" containerID="de4df241bf0429f4c7d3687854c459dc61daeb2dd35192a1c0611d80c7988415" exitCode=0 Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.512277 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"5c90a604-be49-44dc-b350-9df660d8587b","Type":"ContainerDied","Data":"de4df241bf0429f4c7d3687854c459dc61daeb2dd35192a1c0611d80c7988415"} Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.525837 4816 generic.go:334] "Generic (PLEG): container finished" podID="46c3b4df-48c2-4131-8ac4-ea6276d70d54" containerID="cf91bdab1d487b3e9c82088df50a44f305fd300ad681fc9e9b0b6cb01f350748" exitCode=0 Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.526009 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"46c3b4df-48c2-4131-8ac4-ea6276d70d54","Type":"ContainerDied","Data":"cf91bdab1d487b3e9c82088df50a44f305fd300ad681fc9e9b0b6cb01f350748"} Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.529675 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-d9j8j"] Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.529723 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-trp9l" event={"ID":"a1a13dfd-7f8c-4a52-9ace-ffb14d3d91f2","Type":"ContainerStarted","Data":"aa1149a96b819d638525f60a56566f3acd4e8b31d993c4dcdb3dcaa8740d2a99"} Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.535108 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-d9j8j"] Mar 
16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.535491 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-tv2n7" event={"ID":"59c840a8-f288-44ed-83d3-34d47041c6c6","Type":"ContainerDied","Data":"360f090f6a27a9d9ebb782602e54104c845a3d5e91127b115ef7d468e384ebfe"} Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.535798 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-tv2n7" Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.538491 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1feb17b5-7946-4727-a954-d516a9b8469b-proxy-ca-bundles\") pod \"controller-manager-64d6fc58d9-vgh6t\" (UID: \"1feb17b5-7946-4727-a954-d516a9b8469b\") " pod="openshift-controller-manager/controller-manager-64d6fc58d9-vgh6t" Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.538567 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jnvd7\" (UniqueName: \"kubernetes.io/projected/1feb17b5-7946-4727-a954-d516a9b8469b-kube-api-access-jnvd7\") pod \"controller-manager-64d6fc58d9-vgh6t\" (UID: \"1feb17b5-7946-4727-a954-d516a9b8469b\") " pod="openshift-controller-manager/controller-manager-64d6fc58d9-vgh6t" Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.538591 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qt6qg\" (UniqueName: \"kubernetes.io/projected/93c061cd-ed29-4f9c-ad3a-0fb204ce6f8d-kube-api-access-qt6qg\") pod \"route-controller-manager-6d597ffc5b-jhblv\" (UID: \"93c061cd-ed29-4f9c-ad3a-0fb204ce6f8d\") " pod="openshift-route-controller-manager/route-controller-manager-6d597ffc5b-jhblv" Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.538709 4816 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1feb17b5-7946-4727-a954-d516a9b8469b-serving-cert\") pod \"controller-manager-64d6fc58d9-vgh6t\" (UID: \"1feb17b5-7946-4727-a954-d516a9b8469b\") " pod="openshift-controller-manager/controller-manager-64d6fc58d9-vgh6t" Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.538753 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1feb17b5-7946-4727-a954-d516a9b8469b-client-ca\") pod \"controller-manager-64d6fc58d9-vgh6t\" (UID: \"1feb17b5-7946-4727-a954-d516a9b8469b\") " pod="openshift-controller-manager/controller-manager-64d6fc58d9-vgh6t" Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.538801 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93c061cd-ed29-4f9c-ad3a-0fb204ce6f8d-config\") pod \"route-controller-manager-6d597ffc5b-jhblv\" (UID: \"93c061cd-ed29-4f9c-ad3a-0fb204ce6f8d\") " pod="openshift-route-controller-manager/route-controller-manager-6d597ffc5b-jhblv" Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.538868 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1feb17b5-7946-4727-a954-d516a9b8469b-config\") pod \"controller-manager-64d6fc58d9-vgh6t\" (UID: \"1feb17b5-7946-4727-a954-d516a9b8469b\") " pod="openshift-controller-manager/controller-manager-64d6fc58d9-vgh6t" Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.538965 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/93c061cd-ed29-4f9c-ad3a-0fb204ce6f8d-client-ca\") pod \"route-controller-manager-6d597ffc5b-jhblv\" (UID: 
\"93c061cd-ed29-4f9c-ad3a-0fb204ce6f8d\") " pod="openshift-route-controller-manager/route-controller-manager-6d597ffc5b-jhblv" Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.539033 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/93c061cd-ed29-4f9c-ad3a-0fb204ce6f8d-serving-cert\") pod \"route-controller-manager-6d597ffc5b-jhblv\" (UID: \"93c061cd-ed29-4f9c-ad3a-0fb204ce6f8d\") " pod="openshift-route-controller-manager/route-controller-manager-6d597ffc5b-jhblv" Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.558773 4816 patch_prober.go:28] interesting pod/router-default-5444994796-gvk75 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 16 00:10:45 crc kubenswrapper[4816]: [-]has-synced failed: reason withheld Mar 16 00:10:45 crc kubenswrapper[4816]: [+]process-running ok Mar 16 00:10:45 crc kubenswrapper[4816]: healthz check failed Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.558853 4816 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gvk75" podUID="681ca8e4-f909-4e8b-9f35-5ab8ca382e44" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.602171 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-trp9l" podStartSLOduration=14.602152913 podStartE2EDuration="14.602152913s" podCreationTimestamp="2026-03-16 00:10:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:45.600293382 +0000 UTC m=+238.696593335" watchObservedRunningTime="2026-03-16 00:10:45.602152913 +0000 UTC m=+238.698452866" Mar 16 
00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.640679 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/93c061cd-ed29-4f9c-ad3a-0fb204ce6f8d-client-ca\") pod \"route-controller-manager-6d597ffc5b-jhblv\" (UID: \"93c061cd-ed29-4f9c-ad3a-0fb204ce6f8d\") " pod="openshift-route-controller-manager/route-controller-manager-6d597ffc5b-jhblv" Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.640739 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/93c061cd-ed29-4f9c-ad3a-0fb204ce6f8d-serving-cert\") pod \"route-controller-manager-6d597ffc5b-jhblv\" (UID: \"93c061cd-ed29-4f9c-ad3a-0fb204ce6f8d\") " pod="openshift-route-controller-manager/route-controller-manager-6d597ffc5b-jhblv" Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.641485 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1feb17b5-7946-4727-a954-d516a9b8469b-proxy-ca-bundles\") pod \"controller-manager-64d6fc58d9-vgh6t\" (UID: \"1feb17b5-7946-4727-a954-d516a9b8469b\") " pod="openshift-controller-manager/controller-manager-64d6fc58d9-vgh6t" Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.641586 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jnvd7\" (UniqueName: \"kubernetes.io/projected/1feb17b5-7946-4727-a954-d516a9b8469b-kube-api-access-jnvd7\") pod \"controller-manager-64d6fc58d9-vgh6t\" (UID: \"1feb17b5-7946-4727-a954-d516a9b8469b\") " pod="openshift-controller-manager/controller-manager-64d6fc58d9-vgh6t" Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.641612 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qt6qg\" (UniqueName: \"kubernetes.io/projected/93c061cd-ed29-4f9c-ad3a-0fb204ce6f8d-kube-api-access-qt6qg\") pod 
\"route-controller-manager-6d597ffc5b-jhblv\" (UID: \"93c061cd-ed29-4f9c-ad3a-0fb204ce6f8d\") " pod="openshift-route-controller-manager/route-controller-manager-6d597ffc5b-jhblv" Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.641667 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1feb17b5-7946-4727-a954-d516a9b8469b-serving-cert\") pod \"controller-manager-64d6fc58d9-vgh6t\" (UID: \"1feb17b5-7946-4727-a954-d516a9b8469b\") " pod="openshift-controller-manager/controller-manager-64d6fc58d9-vgh6t" Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.641693 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1feb17b5-7946-4727-a954-d516a9b8469b-client-ca\") pod \"controller-manager-64d6fc58d9-vgh6t\" (UID: \"1feb17b5-7946-4727-a954-d516a9b8469b\") " pod="openshift-controller-manager/controller-manager-64d6fc58d9-vgh6t" Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.641820 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93c061cd-ed29-4f9c-ad3a-0fb204ce6f8d-config\") pod \"route-controller-manager-6d597ffc5b-jhblv\" (UID: \"93c061cd-ed29-4f9c-ad3a-0fb204ce6f8d\") " pod="openshift-route-controller-manager/route-controller-manager-6d597ffc5b-jhblv" Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.641856 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1feb17b5-7946-4727-a954-d516a9b8469b-config\") pod \"controller-manager-64d6fc58d9-vgh6t\" (UID: \"1feb17b5-7946-4727-a954-d516a9b8469b\") " pod="openshift-controller-manager/controller-manager-64d6fc58d9-vgh6t" Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.642967 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" 
(UniqueName: \"kubernetes.io/configmap/1feb17b5-7946-4727-a954-d516a9b8469b-proxy-ca-bundles\") pod \"controller-manager-64d6fc58d9-vgh6t\" (UID: \"1feb17b5-7946-4727-a954-d516a9b8469b\") " pod="openshift-controller-manager/controller-manager-64d6fc58d9-vgh6t" Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.643239 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1feb17b5-7946-4727-a954-d516a9b8469b-client-ca\") pod \"controller-manager-64d6fc58d9-vgh6t\" (UID: \"1feb17b5-7946-4727-a954-d516a9b8469b\") " pod="openshift-controller-manager/controller-manager-64d6fc58d9-vgh6t" Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.644113 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/93c061cd-ed29-4f9c-ad3a-0fb204ce6f8d-client-ca\") pod \"route-controller-manager-6d597ffc5b-jhblv\" (UID: \"93c061cd-ed29-4f9c-ad3a-0fb204ce6f8d\") " pod="openshift-route-controller-manager/route-controller-manager-6d597ffc5b-jhblv" Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.644145 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93c061cd-ed29-4f9c-ad3a-0fb204ce6f8d-config\") pod \"route-controller-manager-6d597ffc5b-jhblv\" (UID: \"93c061cd-ed29-4f9c-ad3a-0fb204ce6f8d\") " pod="openshift-route-controller-manager/route-controller-manager-6d597ffc5b-jhblv" Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.644206 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1feb17b5-7946-4727-a954-d516a9b8469b-config\") pod \"controller-manager-64d6fc58d9-vgh6t\" (UID: \"1feb17b5-7946-4727-a954-d516a9b8469b\") " pod="openshift-controller-manager/controller-manager-64d6fc58d9-vgh6t" Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.661121 4816 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/93c061cd-ed29-4f9c-ad3a-0fb204ce6f8d-serving-cert\") pod \"route-controller-manager-6d597ffc5b-jhblv\" (UID: \"93c061cd-ed29-4f9c-ad3a-0fb204ce6f8d\") " pod="openshift-route-controller-manager/route-controller-manager-6d597ffc5b-jhblv" Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.670905 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qt6qg\" (UniqueName: \"kubernetes.io/projected/93c061cd-ed29-4f9c-ad3a-0fb204ce6f8d-kube-api-access-qt6qg\") pod \"route-controller-manager-6d597ffc5b-jhblv\" (UID: \"93c061cd-ed29-4f9c-ad3a-0fb204ce6f8d\") " pod="openshift-route-controller-manager/route-controller-manager-6d597ffc5b-jhblv" Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.677916 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d5466ab-a589-4f7e-ae89-2f494b10f6b1" path="/var/lib/kubelet/pods/1d5466ab-a589-4f7e-ae89-2f494b10f6b1/volumes" Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.686378 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1feb17b5-7946-4727-a954-d516a9b8469b-serving-cert\") pod \"controller-manager-64d6fc58d9-vgh6t\" (UID: \"1feb17b5-7946-4727-a954-d516a9b8469b\") " pod="openshift-controller-manager/controller-manager-64d6fc58d9-vgh6t" Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.690000 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jnvd7\" (UniqueName: \"kubernetes.io/projected/1feb17b5-7946-4727-a954-d516a9b8469b-kube-api-access-jnvd7\") pod \"controller-manager-64d6fc58d9-vgh6t\" (UID: \"1feb17b5-7946-4727-a954-d516a9b8469b\") " pod="openshift-controller-manager/controller-manager-64d6fc58d9-vgh6t" Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.700853 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-64d6fc58d9-vgh6t" Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.705905 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6d597ffc5b-jhblv" Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.714404 4816 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-tv2n7 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.25:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.714480 4816 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-tv2n7" podUID="59c840a8-f288-44ed-83d3-34d47041c6c6" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.25:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.729261 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.730162 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-tv2n7"] Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.730193 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-tv2n7"] Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.738802 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-52qs6"] Mar 16 00:10:45 crc 
kubenswrapper[4816]: I0316 00:10:45.740400 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-52qs6" Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.743233 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.748403 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-52qs6"] Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.843668 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ca6c2c9-3a12-4eb3-9df1-7fdea640791d-catalog-content\") pod \"redhat-operators-52qs6\" (UID: \"6ca6c2c9-3a12-4eb3-9df1-7fdea640791d\") " pod="openshift-marketplace/redhat-operators-52qs6" Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.843728 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrpff\" (UniqueName: \"kubernetes.io/projected/6ca6c2c9-3a12-4eb3-9df1-7fdea640791d-kube-api-access-mrpff\") pod \"redhat-operators-52qs6\" (UID: \"6ca6c2c9-3a12-4eb3-9df1-7fdea640791d\") " pod="openshift-marketplace/redhat-operators-52qs6" Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.843765 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ca6c2c9-3a12-4eb3-9df1-7fdea640791d-utilities\") pod \"redhat-operators-52qs6\" (UID: \"6ca6c2c9-3a12-4eb3-9df1-7fdea640791d\") " pod="openshift-marketplace/redhat-operators-52qs6" Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.945336 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/6ca6c2c9-3a12-4eb3-9df1-7fdea640791d-catalog-content\") pod \"redhat-operators-52qs6\" (UID: \"6ca6c2c9-3a12-4eb3-9df1-7fdea640791d\") " pod="openshift-marketplace/redhat-operators-52qs6" Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.945396 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mrpff\" (UniqueName: \"kubernetes.io/projected/6ca6c2c9-3a12-4eb3-9df1-7fdea640791d-kube-api-access-mrpff\") pod \"redhat-operators-52qs6\" (UID: \"6ca6c2c9-3a12-4eb3-9df1-7fdea640791d\") " pod="openshift-marketplace/redhat-operators-52qs6" Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.945436 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ca6c2c9-3a12-4eb3-9df1-7fdea640791d-utilities\") pod \"redhat-operators-52qs6\" (UID: \"6ca6c2c9-3a12-4eb3-9df1-7fdea640791d\") " pod="openshift-marketplace/redhat-operators-52qs6" Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.945862 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ca6c2c9-3a12-4eb3-9df1-7fdea640791d-utilities\") pod \"redhat-operators-52qs6\" (UID: \"6ca6c2c9-3a12-4eb3-9df1-7fdea640791d\") " pod="openshift-marketplace/redhat-operators-52qs6" Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.946143 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ca6c2c9-3a12-4eb3-9df1-7fdea640791d-catalog-content\") pod \"redhat-operators-52qs6\" (UID: \"6ca6c2c9-3a12-4eb3-9df1-7fdea640791d\") " pod="openshift-marketplace/redhat-operators-52qs6" Mar 16 00:10:45 crc kubenswrapper[4816]: I0316 00:10:45.974667 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrpff\" (UniqueName: 
\"kubernetes.io/projected/6ca6c2c9-3a12-4eb3-9df1-7fdea640791d-kube-api-access-mrpff\") pod \"redhat-operators-52qs6\" (UID: \"6ca6c2c9-3a12-4eb3-9df1-7fdea640791d\") " pod="openshift-marketplace/redhat-operators-52qs6" Mar 16 00:10:46 crc kubenswrapper[4816]: I0316 00:10:46.082756 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-52qs6" Mar 16 00:10:46 crc kubenswrapper[4816]: I0316 00:10:46.136737 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-hvpqn"] Mar 16 00:10:46 crc kubenswrapper[4816]: I0316 00:10:46.137746 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hvpqn" Mar 16 00:10:46 crc kubenswrapper[4816]: I0316 00:10:46.148417 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hvpqn"] Mar 16 00:10:46 crc kubenswrapper[4816]: I0316 00:10:46.249143 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qk7j9\" (UniqueName: \"kubernetes.io/projected/cc1ea93d-1cf8-4145-ad35-83f2d1357f9d-kube-api-access-qk7j9\") pod \"redhat-operators-hvpqn\" (UID: \"cc1ea93d-1cf8-4145-ad35-83f2d1357f9d\") " pod="openshift-marketplace/redhat-operators-hvpqn" Mar 16 00:10:46 crc kubenswrapper[4816]: I0316 00:10:46.249238 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc1ea93d-1cf8-4145-ad35-83f2d1357f9d-catalog-content\") pod \"redhat-operators-hvpqn\" (UID: \"cc1ea93d-1cf8-4145-ad35-83f2d1357f9d\") " pod="openshift-marketplace/redhat-operators-hvpqn" Mar 16 00:10:46 crc kubenswrapper[4816]: I0316 00:10:46.249269 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/cc1ea93d-1cf8-4145-ad35-83f2d1357f9d-utilities\") pod \"redhat-operators-hvpqn\" (UID: \"cc1ea93d-1cf8-4145-ad35-83f2d1357f9d\") " pod="openshift-marketplace/redhat-operators-hvpqn" Mar 16 00:10:46 crc kubenswrapper[4816]: I0316 00:10:46.351124 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qk7j9\" (UniqueName: \"kubernetes.io/projected/cc1ea93d-1cf8-4145-ad35-83f2d1357f9d-kube-api-access-qk7j9\") pod \"redhat-operators-hvpqn\" (UID: \"cc1ea93d-1cf8-4145-ad35-83f2d1357f9d\") " pod="openshift-marketplace/redhat-operators-hvpqn" Mar 16 00:10:46 crc kubenswrapper[4816]: I0316 00:10:46.351224 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc1ea93d-1cf8-4145-ad35-83f2d1357f9d-catalog-content\") pod \"redhat-operators-hvpqn\" (UID: \"cc1ea93d-1cf8-4145-ad35-83f2d1357f9d\") " pod="openshift-marketplace/redhat-operators-hvpqn" Mar 16 00:10:46 crc kubenswrapper[4816]: I0316 00:10:46.351242 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc1ea93d-1cf8-4145-ad35-83f2d1357f9d-utilities\") pod \"redhat-operators-hvpqn\" (UID: \"cc1ea93d-1cf8-4145-ad35-83f2d1357f9d\") " pod="openshift-marketplace/redhat-operators-hvpqn" Mar 16 00:10:46 crc kubenswrapper[4816]: I0316 00:10:46.351771 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc1ea93d-1cf8-4145-ad35-83f2d1357f9d-catalog-content\") pod \"redhat-operators-hvpqn\" (UID: \"cc1ea93d-1cf8-4145-ad35-83f2d1357f9d\") " pod="openshift-marketplace/redhat-operators-hvpqn" Mar 16 00:10:46 crc kubenswrapper[4816]: I0316 00:10:46.351784 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/cc1ea93d-1cf8-4145-ad35-83f2d1357f9d-utilities\") pod \"redhat-operators-hvpqn\" (UID: \"cc1ea93d-1cf8-4145-ad35-83f2d1357f9d\") " pod="openshift-marketplace/redhat-operators-hvpqn" Mar 16 00:10:46 crc kubenswrapper[4816]: I0316 00:10:46.369027 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qk7j9\" (UniqueName: \"kubernetes.io/projected/cc1ea93d-1cf8-4145-ad35-83f2d1357f9d-kube-api-access-qk7j9\") pod \"redhat-operators-hvpqn\" (UID: \"cc1ea93d-1cf8-4145-ad35-83f2d1357f9d\") " pod="openshift-marketplace/redhat-operators-hvpqn" Mar 16 00:10:46 crc kubenswrapper[4816]: I0316 00:10:46.471669 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hvpqn" Mar 16 00:10:46 crc kubenswrapper[4816]: I0316 00:10:46.554138 4816 patch_prober.go:28] interesting pod/router-default-5444994796-gvk75 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 16 00:10:46 crc kubenswrapper[4816]: [-]has-synced failed: reason withheld Mar 16 00:10:46 crc kubenswrapper[4816]: [+]process-running ok Mar 16 00:10:46 crc kubenswrapper[4816]: healthz check failed Mar 16 00:10:46 crc kubenswrapper[4816]: I0316 00:10:46.554387 4816 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gvk75" podUID="681ca8e4-f909-4e8b-9f35-5ab8ca382e44" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 16 00:10:47 crc kubenswrapper[4816]: I0316 00:10:47.553283 4816 patch_prober.go:28] interesting pod/router-default-5444994796-gvk75 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 16 00:10:47 crc kubenswrapper[4816]: [-]has-synced failed: 
reason withheld Mar 16 00:10:47 crc kubenswrapper[4816]: [+]process-running ok Mar 16 00:10:47 crc kubenswrapper[4816]: healthz check failed Mar 16 00:10:47 crc kubenswrapper[4816]: I0316 00:10:47.553365 4816 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gvk75" podUID="681ca8e4-f909-4e8b-9f35-5ab8ca382e44" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 16 00:10:47 crc kubenswrapper[4816]: I0316 00:10:47.685573 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="59c840a8-f288-44ed-83d3-34d47041c6c6" path="/var/lib/kubelet/pods/59c840a8-f288-44ed-83d3-34d47041c6c6/volumes" Mar 16 00:10:48 crc kubenswrapper[4816]: I0316 00:10:48.333195 4816 ???:1] "http: TLS handshake error from 192.168.126.11:52476: no serving certificate available for the kubelet" Mar 16 00:10:48 crc kubenswrapper[4816]: I0316 00:10:48.553620 4816 patch_prober.go:28] interesting pod/router-default-5444994796-gvk75 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 16 00:10:48 crc kubenswrapper[4816]: [-]has-synced failed: reason withheld Mar 16 00:10:48 crc kubenswrapper[4816]: [+]process-running ok Mar 16 00:10:48 crc kubenswrapper[4816]: healthz check failed Mar 16 00:10:48 crc kubenswrapper[4816]: I0316 00:10:48.553682 4816 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gvk75" podUID="681ca8e4-f909-4e8b-9f35-5ab8ca382e44" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 16 00:10:48 crc kubenswrapper[4816]: I0316 00:10:48.954294 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-pdm8d" Mar 16 00:10:48 crc kubenswrapper[4816]: I0316 00:10:48.961503 4816 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-pdm8d" Mar 16 00:10:49 crc kubenswrapper[4816]: W0316 00:10:49.264622 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda5ba22dd_8e8e_4beb_a540_e5c9687810b8.slice/crio-7718a309c71ba8a48a463087b2e901f51d954ea050a7be786e3c0a847d6a54eb WatchSource:0}: Error finding container 7718a309c71ba8a48a463087b2e901f51d954ea050a7be786e3c0a847d6a54eb: Status 404 returned error can't find the container with id 7718a309c71ba8a48a463087b2e901f51d954ea050a7be786e3c0a847d6a54eb Mar 16 00:10:49 crc kubenswrapper[4816]: I0316 00:10:49.273173 4816 scope.go:117] "RemoveContainer" containerID="c46a7076608f889c8e30b77b33715ed49c92e64799e40fe88b9f99f6e980f6a5" Mar 16 00:10:49 crc kubenswrapper[4816]: I0316 00:10:49.335835 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 16 00:10:49 crc kubenswrapper[4816]: I0316 00:10:49.360075 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29560320-4hk5d" Mar 16 00:10:49 crc kubenswrapper[4816]: I0316 00:10:49.363945 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 16 00:10:49 crc kubenswrapper[4816]: I0316 00:10:49.392447 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5c90a604-be49-44dc-b350-9df660d8587b-kube-api-access\") pod \"5c90a604-be49-44dc-b350-9df660d8587b\" (UID: \"5c90a604-be49-44dc-b350-9df660d8587b\") " Mar 16 00:10:49 crc kubenswrapper[4816]: I0316 00:10:49.392496 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5c90a604-be49-44dc-b350-9df660d8587b-kubelet-dir\") pod \"5c90a604-be49-44dc-b350-9df660d8587b\" (UID: \"5c90a604-be49-44dc-b350-9df660d8587b\") " Mar 16 00:10:49 crc kubenswrapper[4816]: I0316 00:10:49.392873 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5c90a604-be49-44dc-b350-9df660d8587b-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "5c90a604-be49-44dc-b350-9df660d8587b" (UID: "5c90a604-be49-44dc-b350-9df660d8587b"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:10:49 crc kubenswrapper[4816]: I0316 00:10:49.398276 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c90a604-be49-44dc-b350-9df660d8587b-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "5c90a604-be49-44dc-b350-9df660d8587b" (UID: "5c90a604-be49-44dc-b350-9df660d8587b"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:10:49 crc kubenswrapper[4816]: I0316 00:10:49.494102 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9397185e-a9e3-4ef4-b0be-d9dc9208adff-config-volume\") pod \"9397185e-a9e3-4ef4-b0be-d9dc9208adff\" (UID: \"9397185e-a9e3-4ef4-b0be-d9dc9208adff\") " Mar 16 00:10:49 crc kubenswrapper[4816]: I0316 00:10:49.494144 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/46c3b4df-48c2-4131-8ac4-ea6276d70d54-kube-api-access\") pod \"46c3b4df-48c2-4131-8ac4-ea6276d70d54\" (UID: \"46c3b4df-48c2-4131-8ac4-ea6276d70d54\") " Mar 16 00:10:49 crc kubenswrapper[4816]: I0316 00:10:49.494168 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pdffs\" (UniqueName: \"kubernetes.io/projected/9397185e-a9e3-4ef4-b0be-d9dc9208adff-kube-api-access-pdffs\") pod \"9397185e-a9e3-4ef4-b0be-d9dc9208adff\" (UID: \"9397185e-a9e3-4ef4-b0be-d9dc9208adff\") " Mar 16 00:10:49 crc kubenswrapper[4816]: I0316 00:10:49.494206 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/46c3b4df-48c2-4131-8ac4-ea6276d70d54-kubelet-dir\") pod \"46c3b4df-48c2-4131-8ac4-ea6276d70d54\" (UID: \"46c3b4df-48c2-4131-8ac4-ea6276d70d54\") " Mar 16 00:10:49 crc kubenswrapper[4816]: I0316 00:10:49.494279 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9397185e-a9e3-4ef4-b0be-d9dc9208adff-secret-volume\") pod \"9397185e-a9e3-4ef4-b0be-d9dc9208adff\" (UID: \"9397185e-a9e3-4ef4-b0be-d9dc9208adff\") " Mar 16 00:10:49 crc kubenswrapper[4816]: I0316 00:10:49.494586 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/5c90a604-be49-44dc-b350-9df660d8587b-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 16 00:10:49 crc kubenswrapper[4816]: I0316 00:10:49.494607 4816 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5c90a604-be49-44dc-b350-9df660d8587b-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 16 00:10:49 crc kubenswrapper[4816]: I0316 00:10:49.495043 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9397185e-a9e3-4ef4-b0be-d9dc9208adff-config-volume" (OuterVolumeSpecName: "config-volume") pod "9397185e-a9e3-4ef4-b0be-d9dc9208adff" (UID: "9397185e-a9e3-4ef4-b0be-d9dc9208adff"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:10:49 crc kubenswrapper[4816]: I0316 00:10:49.495109 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/46c3b4df-48c2-4131-8ac4-ea6276d70d54-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "46c3b4df-48c2-4131-8ac4-ea6276d70d54" (UID: "46c3b4df-48c2-4131-8ac4-ea6276d70d54"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:10:49 crc kubenswrapper[4816]: I0316 00:10:49.498146 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9397185e-a9e3-4ef4-b0be-d9dc9208adff-kube-api-access-pdffs" (OuterVolumeSpecName: "kube-api-access-pdffs") pod "9397185e-a9e3-4ef4-b0be-d9dc9208adff" (UID: "9397185e-a9e3-4ef4-b0be-d9dc9208adff"). InnerVolumeSpecName "kube-api-access-pdffs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:10:49 crc kubenswrapper[4816]: I0316 00:10:49.498304 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46c3b4df-48c2-4131-8ac4-ea6276d70d54-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "46c3b4df-48c2-4131-8ac4-ea6276d70d54" (UID: "46c3b4df-48c2-4131-8ac4-ea6276d70d54"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:10:49 crc kubenswrapper[4816]: I0316 00:10:49.498733 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9397185e-a9e3-4ef4-b0be-d9dc9208adff-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "9397185e-a9e3-4ef4-b0be-d9dc9208adff" (UID: "9397185e-a9e3-4ef4-b0be-d9dc9208adff"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:10:49 crc kubenswrapper[4816]: I0316 00:10:49.555612 4816 patch_prober.go:28] interesting pod/router-default-5444994796-gvk75 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 16 00:10:49 crc kubenswrapper[4816]: [-]has-synced failed: reason withheld Mar 16 00:10:49 crc kubenswrapper[4816]: [+]process-running ok Mar 16 00:10:49 crc kubenswrapper[4816]: healthz check failed Mar 16 00:10:49 crc kubenswrapper[4816]: I0316 00:10:49.556458 4816 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gvk75" podUID="681ca8e4-f909-4e8b-9f35-5ab8ca382e44" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 16 00:10:49 crc kubenswrapper[4816]: I0316 00:10:49.558603 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29560320-4hk5d" 
event={"ID":"9397185e-a9e3-4ef4-b0be-d9dc9208adff","Type":"ContainerDied","Data":"b19b5574ead1cf818c519a7ffb8ef773b81e380296fd94d88cb6d44a3be77066"} Mar 16 00:10:49 crc kubenswrapper[4816]: I0316 00:10:49.558648 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b19b5574ead1cf818c519a7ffb8ef773b81e380296fd94d88cb6d44a3be77066" Mar 16 00:10:49 crc kubenswrapper[4816]: I0316 00:10:49.558626 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29560320-4hk5d" Mar 16 00:10:49 crc kubenswrapper[4816]: I0316 00:10:49.561468 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"5c90a604-be49-44dc-b350-9df660d8587b","Type":"ContainerDied","Data":"f6dda169be5b3a08dfad7ad03016a7926ed8e41ada3f5914164856f0c44926f0"} Mar 16 00:10:49 crc kubenswrapper[4816]: I0316 00:10:49.561495 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f6dda169be5b3a08dfad7ad03016a7926ed8e41ada3f5914164856f0c44926f0" Mar 16 00:10:49 crc kubenswrapper[4816]: I0316 00:10:49.561581 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 16 00:10:49 crc kubenswrapper[4816]: I0316 00:10:49.581315 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 16 00:10:49 crc kubenswrapper[4816]: I0316 00:10:49.581307 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"46c3b4df-48c2-4131-8ac4-ea6276d70d54","Type":"ContainerDied","Data":"2c1fef6767b0e1fca9d9b3150317fb66c123ea69c5309a134a4ad450d4919157"} Mar 16 00:10:49 crc kubenswrapper[4816]: I0316 00:10:49.581581 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2c1fef6767b0e1fca9d9b3150317fb66c123ea69c5309a134a4ad450d4919157" Mar 16 00:10:49 crc kubenswrapper[4816]: I0316 00:10:49.588383 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7pb49" event={"ID":"a5ba22dd-8e8e-4beb-a540-e5c9687810b8","Type":"ContainerStarted","Data":"7718a309c71ba8a48a463087b2e901f51d954ea050a7be786e3c0a847d6a54eb"} Mar 16 00:10:49 crc kubenswrapper[4816]: I0316 00:10:49.595988 4816 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/46c3b4df-48c2-4131-8ac4-ea6276d70d54-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 16 00:10:49 crc kubenswrapper[4816]: I0316 00:10:49.596017 4816 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9397185e-a9e3-4ef4-b0be-d9dc9208adff-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 16 00:10:49 crc kubenswrapper[4816]: I0316 00:10:49.596031 4816 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9397185e-a9e3-4ef4-b0be-d9dc9208adff-config-volume\") on node \"crc\" DevicePath \"\"" Mar 16 00:10:49 crc kubenswrapper[4816]: I0316 00:10:49.596043 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/46c3b4df-48c2-4131-8ac4-ea6276d70d54-kube-api-access\") on node 
\"crc\" DevicePath \"\"" Mar 16 00:10:49 crc kubenswrapper[4816]: I0316 00:10:49.596055 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pdffs\" (UniqueName: \"kubernetes.io/projected/9397185e-a9e3-4ef4-b0be-d9dc9208adff-kube-api-access-pdffs\") on node \"crc\" DevicePath \"\"" Mar 16 00:10:50 crc kubenswrapper[4816]: I0316 00:10:50.427998 4816 ???:1] "http: TLS handshake error from 192.168.126.11:52478: no serving certificate available for the kubelet" Mar 16 00:10:50 crc kubenswrapper[4816]: I0316 00:10:50.559902 4816 patch_prober.go:28] interesting pod/router-default-5444994796-gvk75 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 16 00:10:50 crc kubenswrapper[4816]: [-]has-synced failed: reason withheld Mar 16 00:10:50 crc kubenswrapper[4816]: [+]process-running ok Mar 16 00:10:50 crc kubenswrapper[4816]: healthz check failed Mar 16 00:10:50 crc kubenswrapper[4816]: I0316 00:10:50.559989 4816 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gvk75" podUID="681ca8e4-f909-4e8b-9f35-5ab8ca382e44" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 16 00:10:50 crc kubenswrapper[4816]: I0316 00:10:50.584232 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-npvts" Mar 16 00:10:51 crc kubenswrapper[4816]: I0316 00:10:51.553632 4816 patch_prober.go:28] interesting pod/router-default-5444994796-gvk75 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 16 00:10:51 crc kubenswrapper[4816]: [-]has-synced failed: reason withheld Mar 16 00:10:51 crc kubenswrapper[4816]: [+]process-running ok Mar 16 00:10:51 crc kubenswrapper[4816]: healthz 
check failed Mar 16 00:10:51 crc kubenswrapper[4816]: I0316 00:10:51.553689 4816 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gvk75" podUID="681ca8e4-f909-4e8b-9f35-5ab8ca382e44" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 16 00:10:52 crc kubenswrapper[4816]: I0316 00:10:52.556586 4816 patch_prober.go:28] interesting pod/router-default-5444994796-gvk75 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 16 00:10:52 crc kubenswrapper[4816]: [-]has-synced failed: reason withheld Mar 16 00:10:52 crc kubenswrapper[4816]: [+]process-running ok Mar 16 00:10:52 crc kubenswrapper[4816]: healthz check failed Mar 16 00:10:52 crc kubenswrapper[4816]: I0316 00:10:52.556919 4816 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gvk75" podUID="681ca8e4-f909-4e8b-9f35-5ab8ca382e44" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 16 00:10:53 crc kubenswrapper[4816]: I0316 00:10:53.246477 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/84360ef9-0450-44c5-80eb-eab1bf8e808b-metrics-certs\") pod \"network-metrics-daemon-jqsjn\" (UID: \"84360ef9-0450-44c5-80eb-eab1bf8e808b\") " pod="openshift-multus/network-metrics-daemon-jqsjn" Mar 16 00:10:53 crc kubenswrapper[4816]: I0316 00:10:53.248233 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 16 00:10:53 crc kubenswrapper[4816]: I0316 00:10:53.267666 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/84360ef9-0450-44c5-80eb-eab1bf8e808b-metrics-certs\") pod 
\"network-metrics-daemon-jqsjn\" (UID: \"84360ef9-0450-44c5-80eb-eab1bf8e808b\") " pod="openshift-multus/network-metrics-daemon-jqsjn" Mar 16 00:10:53 crc kubenswrapper[4816]: I0316 00:10:53.384527 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 16 00:10:53 crc kubenswrapper[4816]: I0316 00:10:53.393149 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jqsjn" Mar 16 00:10:53 crc kubenswrapper[4816]: I0316 00:10:53.552771 4816 patch_prober.go:28] interesting pod/router-default-5444994796-gvk75 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 16 00:10:53 crc kubenswrapper[4816]: [-]has-synced failed: reason withheld Mar 16 00:10:53 crc kubenswrapper[4816]: [+]process-running ok Mar 16 00:10:53 crc kubenswrapper[4816]: healthz check failed Mar 16 00:10:53 crc kubenswrapper[4816]: I0316 00:10:53.552845 4816 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gvk75" podUID="681ca8e4-f909-4e8b-9f35-5ab8ca382e44" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 16 00:10:54 crc kubenswrapper[4816]: I0316 00:10:54.420048 4816 patch_prober.go:28] interesting pod/console-f9d7485db-nnqsw container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.5:8443/health\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body= Mar 16 00:10:54 crc kubenswrapper[4816]: I0316 00:10:54.420677 4816 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-nnqsw" podUID="32cac2d0-56f0-4ba8-86dc-9b57c0fcc11c" containerName="console" probeResult="failure" output="Get \"https://10.217.0.5:8443/health\": dial tcp 10.217.0.5:8443: 
connect: connection refused" Mar 16 00:10:54 crc kubenswrapper[4816]: I0316 00:10:54.500254 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6gbkl"] Mar 16 00:10:54 crc kubenswrapper[4816]: I0316 00:10:54.561721 4816 patch_prober.go:28] interesting pod/router-default-5444994796-gvk75 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 16 00:10:54 crc kubenswrapper[4816]: [-]has-synced failed: reason withheld Mar 16 00:10:54 crc kubenswrapper[4816]: [+]process-running ok Mar 16 00:10:54 crc kubenswrapper[4816]: healthz check failed Mar 16 00:10:54 crc kubenswrapper[4816]: I0316 00:10:54.561793 4816 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gvk75" podUID="681ca8e4-f909-4e8b-9f35-5ab8ca382e44" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 16 00:10:54 crc kubenswrapper[4816]: I0316 00:10:54.581057 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6d597ffc5b-jhblv"] Mar 16 00:10:54 crc kubenswrapper[4816]: I0316 00:10:54.595518 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-ckvwn"] Mar 16 00:10:54 crc kubenswrapper[4816]: I0316 00:10:54.618856 4816 generic.go:334] "Generic (PLEG): container finished" podID="a5ba22dd-8e8e-4beb-a540-e5c9687810b8" containerID="f01a941d4255e96b1cacdf7b072faf6dd7d1c330c59ed7ec0d337e5fb9c8854f" exitCode=0 Mar 16 00:10:54 crc kubenswrapper[4816]: I0316 00:10:54.618939 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7pb49" event={"ID":"a5ba22dd-8e8e-4beb-a540-e5c9687810b8","Type":"ContainerDied","Data":"f01a941d4255e96b1cacdf7b072faf6dd7d1c330c59ed7ec0d337e5fb9c8854f"} Mar 16 
00:10:54 crc kubenswrapper[4816]: I0316 00:10:54.620214 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6gbkl" event={"ID":"bad7b5f7-88a8-4c20-a010-734a46f59e05","Type":"ContainerStarted","Data":"dfccefcb0f8e6864404f0a8715036becb9b7ec4a3aef59dca2da5e935bde36d5"} Mar 16 00:10:54 crc kubenswrapper[4816]: W0316 00:10:54.631372 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod93c061cd_ed29_4f9c_ad3a_0fb204ce6f8d.slice/crio-e26612b919b84db051d8d1f5b5f0b9a5f292e7d09dcad15803116b4fcd5c25d6 WatchSource:0}: Error finding container e26612b919b84db051d8d1f5b5f0b9a5f292e7d09dcad15803116b4fcd5c25d6: Status 404 returned error can't find the container with id e26612b919b84db051d8d1f5b5f0b9a5f292e7d09dcad15803116b4fcd5c25d6 Mar 16 00:10:54 crc kubenswrapper[4816]: I0316 00:10:54.669984 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-64d6fc58d9-vgh6t"] Mar 16 00:10:54 crc kubenswrapper[4816]: I0316 00:10:54.688878 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-52qs6"] Mar 16 00:10:54 crc kubenswrapper[4816]: I0316 00:10:54.736972 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-jqsjn"] Mar 16 00:10:54 crc kubenswrapper[4816]: I0316 00:10:54.747831 4816 patch_prober.go:28] interesting pod/downloads-7954f5f757-5rr7c container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Mar 16 00:10:54 crc kubenswrapper[4816]: I0316 00:10:54.747868 4816 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-5rr7c" podUID="0ec3cdc0-f024-43cf-b520-7d2437e0f8df" containerName="download-server" probeResult="failure" output="Get 
\"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Mar 16 00:10:54 crc kubenswrapper[4816]: I0316 00:10:54.747868 4816 patch_prober.go:28] interesting pod/downloads-7954f5f757-5rr7c container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Mar 16 00:10:54 crc kubenswrapper[4816]: I0316 00:10:54.747956 4816 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-5rr7c" podUID="0ec3cdc0-f024-43cf-b520-7d2437e0f8df" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Mar 16 00:10:54 crc kubenswrapper[4816]: W0316 00:10:54.763135 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod84360ef9_0450_44c5_80eb_eab1bf8e808b.slice/crio-f8a8d5047faf25a14762538ff1be7c473a628d88a02f801a939974322aa4c0fd WatchSource:0}: Error finding container f8a8d5047faf25a14762538ff1be7c473a628d88a02f801a939974322aa4c0fd: Status 404 returned error can't find the container with id f8a8d5047faf25a14762538ff1be7c473a628d88a02f801a939974322aa4c0fd Mar 16 00:10:54 crc kubenswrapper[4816]: I0316 00:10:54.768076 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hvpqn"] Mar 16 00:10:55 crc kubenswrapper[4816]: I0316 00:10:55.554143 4816 patch_prober.go:28] interesting pod/router-default-5444994796-gvk75 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 16 00:10:55 crc kubenswrapper[4816]: [-]has-synced failed: reason withheld Mar 16 00:10:55 crc kubenswrapper[4816]: [+]process-running ok Mar 16 00:10:55 crc kubenswrapper[4816]: healthz 
check failed Mar 16 00:10:55 crc kubenswrapper[4816]: I0316 00:10:55.554885 4816 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gvk75" podUID="681ca8e4-f909-4e8b-9f35-5ab8ca382e44" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 16 00:10:55 crc kubenswrapper[4816]: I0316 00:10:55.629768 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6d597ffc5b-jhblv" event={"ID":"93c061cd-ed29-4f9c-ad3a-0fb204ce6f8d","Type":"ContainerStarted","Data":"8fdaf15dae2b09a126e743d43c57d752450921c814448bf980f67e094859a0df"} Mar 16 00:10:55 crc kubenswrapper[4816]: I0316 00:10:55.629819 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6d597ffc5b-jhblv" event={"ID":"93c061cd-ed29-4f9c-ad3a-0fb204ce6f8d","Type":"ContainerStarted","Data":"e26612b919b84db051d8d1f5b5f0b9a5f292e7d09dcad15803116b4fcd5c25d6"} Mar 16 00:10:55 crc kubenswrapper[4816]: I0316 00:10:55.631350 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-52qs6" event={"ID":"6ca6c2c9-3a12-4eb3-9df1-7fdea640791d","Type":"ContainerStarted","Data":"056d2f945def262f75a277cf64e3b3f1d5e6532d0a76f4258d544edfbbff2503"} Mar 16 00:10:55 crc kubenswrapper[4816]: I0316 00:10:55.631374 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-52qs6" event={"ID":"6ca6c2c9-3a12-4eb3-9df1-7fdea640791d","Type":"ContainerStarted","Data":"47689d47c5b861a3bd4357a2faba7a8ab87d56775475b31d461c37bf8423f524"} Mar 16 00:10:55 crc kubenswrapper[4816]: I0316 00:10:55.635528 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-64d6fc58d9-vgh6t" 
event={"ID":"1feb17b5-7946-4727-a954-d516a9b8469b","Type":"ContainerStarted","Data":"ed184c9fb61316e6cf511969f5857732bc0cbe98f1fc984c31c588b0377ff308"} Mar 16 00:10:55 crc kubenswrapper[4816]: I0316 00:10:55.635613 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-64d6fc58d9-vgh6t" event={"ID":"1feb17b5-7946-4727-a954-d516a9b8469b","Type":"ContainerStarted","Data":"2b950732b2a5bf6036d818014b94cf2a7cdbaaa448fc7e9ce26ccb0e98f8f687"} Mar 16 00:10:55 crc kubenswrapper[4816]: I0316 00:10:55.637941 4816 generic.go:334] "Generic (PLEG): container finished" podID="bad7b5f7-88a8-4c20-a010-734a46f59e05" containerID="5259cd97d29c896bcf8ba7141fe44641e990295b28288f54dfe4315de536ad23" exitCode=0 Mar 16 00:10:55 crc kubenswrapper[4816]: I0316 00:10:55.637996 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6gbkl" event={"ID":"bad7b5f7-88a8-4c20-a010-734a46f59e05","Type":"ContainerDied","Data":"5259cd97d29c896bcf8ba7141fe44641e990295b28288f54dfe4315de536ad23"} Mar 16 00:10:55 crc kubenswrapper[4816]: I0316 00:10:55.641075 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" event={"ID":"b155133b-d494-44bc-aa5d-23efc7cbd7a6","Type":"ContainerStarted","Data":"4318cca39cc5c9226ea995c847972dcda126bf000cf9cc35ad09a9cdabdc663b"} Mar 16 00:10:55 crc kubenswrapper[4816]: I0316 00:10:55.641120 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" event={"ID":"b155133b-d494-44bc-aa5d-23efc7cbd7a6","Type":"ContainerStarted","Data":"e368502f9ca177437add127848813e2ad33e96c185b8ab726042b2878dcec995"} Mar 16 00:10:55 crc kubenswrapper[4816]: I0316 00:10:55.643724 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-jqsjn" 
event={"ID":"84360ef9-0450-44c5-80eb-eab1bf8e808b","Type":"ContainerStarted","Data":"af04d6d535b6c5f27d1a59ca3be2b5e9cd465bd48de2b8ba8e2eb0e281a5d8ac"} Mar 16 00:10:55 crc kubenswrapper[4816]: I0316 00:10:55.643753 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-jqsjn" event={"ID":"84360ef9-0450-44c5-80eb-eab1bf8e808b","Type":"ContainerStarted","Data":"f8a8d5047faf25a14762538ff1be7c473a628d88a02f801a939974322aa4c0fd"} Mar 16 00:10:55 crc kubenswrapper[4816]: I0316 00:10:55.647432 4816 generic.go:334] "Generic (PLEG): container finished" podID="cc1ea93d-1cf8-4145-ad35-83f2d1357f9d" containerID="d3d02136defedca51b696822546773a5d6f3e05f0581bc5504bae4a17393efcc" exitCode=0 Mar 16 00:10:55 crc kubenswrapper[4816]: I0316 00:10:55.647475 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hvpqn" event={"ID":"cc1ea93d-1cf8-4145-ad35-83f2d1357f9d","Type":"ContainerDied","Data":"d3d02136defedca51b696822546773a5d6f3e05f0581bc5504bae4a17393efcc"} Mar 16 00:10:55 crc kubenswrapper[4816]: I0316 00:10:55.647504 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hvpqn" event={"ID":"cc1ea93d-1cf8-4145-ad35-83f2d1357f9d","Type":"ContainerStarted","Data":"a7d840d19860a5867af8d4206630041069552968b0c74710a21974d2b8f8f661"} Mar 16 00:10:56 crc kubenswrapper[4816]: I0316 00:10:56.552056 4816 patch_prober.go:28] interesting pod/router-default-5444994796-gvk75 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 16 00:10:56 crc kubenswrapper[4816]: [-]has-synced failed: reason withheld Mar 16 00:10:56 crc kubenswrapper[4816]: [+]process-running ok Mar 16 00:10:56 crc kubenswrapper[4816]: healthz check failed Mar 16 00:10:56 crc kubenswrapper[4816]: I0316 00:10:56.552123 4816 prober.go:107] "Probe failed" 
probeType="Startup" pod="openshift-ingress/router-default-5444994796-gvk75" podUID="681ca8e4-f909-4e8b-9f35-5ab8ca382e44" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 16 00:10:56 crc kubenswrapper[4816]: I0316 00:10:56.656529 4816 generic.go:334] "Generic (PLEG): container finished" podID="6ca6c2c9-3a12-4eb3-9df1-7fdea640791d" containerID="056d2f945def262f75a277cf64e3b3f1d5e6532d0a76f4258d544edfbbff2503" exitCode=0 Mar 16 00:10:56 crc kubenswrapper[4816]: I0316 00:10:56.656604 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-52qs6" event={"ID":"6ca6c2c9-3a12-4eb3-9df1-7fdea640791d","Type":"ContainerDied","Data":"056d2f945def262f75a277cf64e3b3f1d5e6532d0a76f4258d544edfbbff2503"} Mar 16 00:10:56 crc kubenswrapper[4816]: I0316 00:10:56.657083 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:10:56 crc kubenswrapper[4816]: I0316 00:10:56.675979 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-64d6fc58d9-vgh6t" podStartSLOduration=13.675958219 podStartE2EDuration="13.675958219s" podCreationTimestamp="2026-03-16 00:10:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:56.673928353 +0000 UTC m=+249.770228326" watchObservedRunningTime="2026-03-16 00:10:56.675958219 +0000 UTC m=+249.772258172" Mar 16 00:10:56 crc kubenswrapper[4816]: I0316 00:10:56.716741 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" podStartSLOduration=198.716718481 podStartE2EDuration="3m18.716718481s" podCreationTimestamp="2026-03-16 00:07:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 
+0000 UTC" observedRunningTime="2026-03-16 00:10:56.714326976 +0000 UTC m=+249.810626949" watchObservedRunningTime="2026-03-16 00:10:56.716718481 +0000 UTC m=+249.813018434" Mar 16 00:10:56 crc kubenswrapper[4816]: I0316 00:10:56.772592 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6d597ffc5b-jhblv" podStartSLOduration=13.772574126 podStartE2EDuration="13.772574126s" podCreationTimestamp="2026-03-16 00:10:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:56.770073358 +0000 UTC m=+249.866373311" watchObservedRunningTime="2026-03-16 00:10:56.772574126 +0000 UTC m=+249.868874079" Mar 16 00:10:57 crc kubenswrapper[4816]: I0316 00:10:57.555012 4816 patch_prober.go:28] interesting pod/router-default-5444994796-gvk75 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 16 00:10:57 crc kubenswrapper[4816]: [-]has-synced failed: reason withheld Mar 16 00:10:57 crc kubenswrapper[4816]: [+]process-running ok Mar 16 00:10:57 crc kubenswrapper[4816]: healthz check failed Mar 16 00:10:57 crc kubenswrapper[4816]: I0316 00:10:57.555365 4816 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gvk75" podUID="681ca8e4-f909-4e8b-9f35-5ab8ca382e44" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 16 00:10:57 crc kubenswrapper[4816]: I0316 00:10:57.698270 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-jqsjn" event={"ID":"84360ef9-0450-44c5-80eb-eab1bf8e808b","Type":"ContainerStarted","Data":"c025ab66399e1eec77c882d25daa4e18498f64990e5075d55e63826832d6af3d"} Mar 16 00:10:57 crc kubenswrapper[4816]: I0316 00:10:57.712988 
4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560330-44pts" event={"ID":"55e76e8f-7d69-4f55-81f8-45c9c612876b","Type":"ContainerStarted","Data":"a0546877ac51e8fef907f2152b03530a1aaadfb1ec0bb2cad119c19beb5651ba"} Mar 16 00:10:57 crc kubenswrapper[4816]: I0316 00:10:57.756736 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29560330-44pts" podStartSLOduration=36.693446902 podStartE2EDuration="57.756702386s" podCreationTimestamp="2026-03-16 00:10:00 +0000 UTC" firstStartedPulling="2026-03-16 00:10:35.702247599 +0000 UTC m=+228.798547552" lastFinishedPulling="2026-03-16 00:10:56.765503093 +0000 UTC m=+249.861803036" observedRunningTime="2026-03-16 00:10:57.755651037 +0000 UTC m=+250.851950990" watchObservedRunningTime="2026-03-16 00:10:57.756702386 +0000 UTC m=+250.853002339" Mar 16 00:10:57 crc kubenswrapper[4816]: I0316 00:10:57.772835 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-jqsjn" podStartSLOduration=200.772814736 podStartE2EDuration="3m20.772814736s" podCreationTimestamp="2026-03-16 00:07:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:57.770612726 +0000 UTC m=+250.866912699" watchObservedRunningTime="2026-03-16 00:10:57.772814736 +0000 UTC m=+250.869114689" Mar 16 00:10:57 crc kubenswrapper[4816]: I0316 00:10:57.972261 4816 csr.go:261] certificate signing request csr-kjjbq is approved, waiting to be issued Mar 16 00:10:57 crc kubenswrapper[4816]: I0316 00:10:57.980628 4816 csr.go:257] certificate signing request csr-kjjbq is issued Mar 16 00:10:58 crc kubenswrapper[4816]: I0316 00:10:58.554062 4816 patch_prober.go:28] interesting pod/router-default-5444994796-gvk75 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with 
statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 16 00:10:58 crc kubenswrapper[4816]: [-]has-synced failed: reason withheld Mar 16 00:10:58 crc kubenswrapper[4816]: [+]process-running ok Mar 16 00:10:58 crc kubenswrapper[4816]: healthz check failed Mar 16 00:10:58 crc kubenswrapper[4816]: I0316 00:10:58.554122 4816 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gvk75" podUID="681ca8e4-f909-4e8b-9f35-5ab8ca382e44" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 16 00:10:58 crc kubenswrapper[4816]: I0316 00:10:58.723606 4816 generic.go:334] "Generic (PLEG): container finished" podID="55e76e8f-7d69-4f55-81f8-45c9c612876b" containerID="a0546877ac51e8fef907f2152b03530a1aaadfb1ec0bb2cad119c19beb5651ba" exitCode=0 Mar 16 00:10:58 crc kubenswrapper[4816]: I0316 00:10:58.723668 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560330-44pts" event={"ID":"55e76e8f-7d69-4f55-81f8-45c9c612876b","Type":"ContainerDied","Data":"a0546877ac51e8fef907f2152b03530a1aaadfb1ec0bb2cad119c19beb5651ba"} Mar 16 00:10:58 crc kubenswrapper[4816]: I0316 00:10:58.981684 4816 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-30 23:38:08.310272768 +0000 UTC Mar 16 00:10:58 crc kubenswrapper[4816]: I0316 00:10:58.981723 4816 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6959h27m9.328552206s for next certificate rotation Mar 16 00:10:59 crc kubenswrapper[4816]: I0316 00:10:59.553783 4816 patch_prober.go:28] interesting pod/router-default-5444994796-gvk75 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 16 00:10:59 crc kubenswrapper[4816]: [-]has-synced failed: reason withheld Mar 16 
00:10:59 crc kubenswrapper[4816]: [+]process-running ok Mar 16 00:10:59 crc kubenswrapper[4816]: healthz check failed Mar 16 00:10:59 crc kubenswrapper[4816]: I0316 00:10:59.553840 4816 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gvk75" podUID="681ca8e4-f909-4e8b-9f35-5ab8ca382e44" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 16 00:10:59 crc kubenswrapper[4816]: I0316 00:10:59.982593 4816 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-11-29 11:23:32.366681093 +0000 UTC Mar 16 00:10:59 crc kubenswrapper[4816]: I0316 00:10:59.982633 4816 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6203h12m32.384051423s for next certificate rotation Mar 16 00:11:00 crc kubenswrapper[4816]: I0316 00:11:00.053345 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560330-44pts" Mar 16 00:11:00 crc kubenswrapper[4816]: I0316 00:11:00.153659 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dq52v\" (UniqueName: \"kubernetes.io/projected/55e76e8f-7d69-4f55-81f8-45c9c612876b-kube-api-access-dq52v\") pod \"55e76e8f-7d69-4f55-81f8-45c9c612876b\" (UID: \"55e76e8f-7d69-4f55-81f8-45c9c612876b\") " Mar 16 00:11:00 crc kubenswrapper[4816]: I0316 00:11:00.160775 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55e76e8f-7d69-4f55-81f8-45c9c612876b-kube-api-access-dq52v" (OuterVolumeSpecName: "kube-api-access-dq52v") pod "55e76e8f-7d69-4f55-81f8-45c9c612876b" (UID: "55e76e8f-7d69-4f55-81f8-45c9c612876b"). InnerVolumeSpecName "kube-api-access-dq52v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:11:00 crc kubenswrapper[4816]: I0316 00:11:00.254890 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dq52v\" (UniqueName: \"kubernetes.io/projected/55e76e8f-7d69-4f55-81f8-45c9c612876b-kube-api-access-dq52v\") on node \"crc\" DevicePath \"\"" Mar 16 00:11:00 crc kubenswrapper[4816]: I0316 00:11:00.554560 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-gvk75" Mar 16 00:11:00 crc kubenswrapper[4816]: I0316 00:11:00.558587 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-gvk75" Mar 16 00:11:00 crc kubenswrapper[4816]: I0316 00:11:00.746262 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560330-44pts" Mar 16 00:11:00 crc kubenswrapper[4816]: I0316 00:11:00.746278 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560330-44pts" event={"ID":"55e76e8f-7d69-4f55-81f8-45c9c612876b","Type":"ContainerDied","Data":"14379482594ebf801c25583d0aab03c78f3555265f22f25f8cbeb498177ecef2"} Mar 16 00:11:00 crc kubenswrapper[4816]: I0316 00:11:00.746338 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="14379482594ebf801c25583d0aab03c78f3555265f22f25f8cbeb498177ecef2" Mar 16 00:11:01 crc kubenswrapper[4816]: I0316 00:11:01.863033 4816 patch_prober.go:28] interesting pod/machine-config-daemon-jrdcz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 16 00:11:01 crc kubenswrapper[4816]: I0316 00:11:01.863087 4816 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" podUID="dd08ece2-7636-4966-973a-e96a34b70b53" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 16 00:11:01 crc kubenswrapper[4816]: I0316 00:11:01.886477 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-64d6fc58d9-vgh6t"] Mar 16 00:11:01 crc kubenswrapper[4816]: I0316 00:11:01.887057 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-64d6fc58d9-vgh6t" podUID="1feb17b5-7946-4727-a954-d516a9b8469b" containerName="controller-manager" containerID="cri-o://ed184c9fb61316e6cf511969f5857732bc0cbe98f1fc984c31c588b0377ff308" gracePeriod=30 Mar 16 00:11:01 crc kubenswrapper[4816]: I0316 00:11:01.890011 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-64d6fc58d9-vgh6t" Mar 16 00:11:01 crc kubenswrapper[4816]: I0316 00:11:01.893158 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6d597ffc5b-jhblv"] Mar 16 00:11:01 crc kubenswrapper[4816]: I0316 00:11:01.893408 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6d597ffc5b-jhblv" podUID="93c061cd-ed29-4f9c-ad3a-0fb204ce6f8d" containerName="route-controller-manager" containerID="cri-o://8fdaf15dae2b09a126e743d43c57d752450921c814448bf980f67e094859a0df" gracePeriod=30 Mar 16 00:11:01 crc kubenswrapper[4816]: I0316 00:11:01.897737 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6d597ffc5b-jhblv" Mar 16 00:11:01 crc kubenswrapper[4816]: I0316 00:11:01.910852 4816 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6d597ffc5b-jhblv" Mar 16 00:11:01 crc kubenswrapper[4816]: I0316 00:11:01.926931 4816 patch_prober.go:28] interesting pod/controller-manager-64d6fc58d9-vgh6t container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.54:8443/healthz\": read tcp 10.217.0.2:54684->10.217.0.54:8443: read: connection reset by peer" start-of-body= Mar 16 00:11:01 crc kubenswrapper[4816]: I0316 00:11:01.926991 4816 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-64d6fc58d9-vgh6t" podUID="1feb17b5-7946-4727-a954-d516a9b8469b" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.54:8443/healthz\": read tcp 10.217.0.2:54684->10.217.0.54:8443: read: connection reset by peer" Mar 16 00:11:02 crc kubenswrapper[4816]: I0316 00:11:02.760480 4816 generic.go:334] "Generic (PLEG): container finished" podID="1feb17b5-7946-4727-a954-d516a9b8469b" containerID="ed184c9fb61316e6cf511969f5857732bc0cbe98f1fc984c31c588b0377ff308" exitCode=0 Mar 16 00:11:02 crc kubenswrapper[4816]: I0316 00:11:02.760587 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-64d6fc58d9-vgh6t" event={"ID":"1feb17b5-7946-4727-a954-d516a9b8469b","Type":"ContainerDied","Data":"ed184c9fb61316e6cf511969f5857732bc0cbe98f1fc984c31c588b0377ff308"} Mar 16 00:11:02 crc kubenswrapper[4816]: I0316 00:11:02.762813 4816 generic.go:334] "Generic (PLEG): container finished" podID="93c061cd-ed29-4f9c-ad3a-0fb204ce6f8d" containerID="8fdaf15dae2b09a126e743d43c57d752450921c814448bf980f67e094859a0df" exitCode=0 Mar 16 00:11:02 crc kubenswrapper[4816]: I0316 00:11:02.762841 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6d597ffc5b-jhblv" 
event={"ID":"93c061cd-ed29-4f9c-ad3a-0fb204ce6f8d","Type":"ContainerDied","Data":"8fdaf15dae2b09a126e743d43c57d752450921c814448bf980f67e094859a0df"} Mar 16 00:11:04 crc kubenswrapper[4816]: I0316 00:11:04.419263 4816 patch_prober.go:28] interesting pod/console-f9d7485db-nnqsw container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.5:8443/health\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body= Mar 16 00:11:04 crc kubenswrapper[4816]: I0316 00:11:04.419324 4816 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-nnqsw" podUID="32cac2d0-56f0-4ba8-86dc-9b57c0fcc11c" containerName="console" probeResult="failure" output="Get \"https://10.217.0.5:8443/health\": dial tcp 10.217.0.5:8443: connect: connection refused" Mar 16 00:11:04 crc kubenswrapper[4816]: I0316 00:11:04.749012 4816 patch_prober.go:28] interesting pod/downloads-7954f5f757-5rr7c container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Mar 16 00:11:04 crc kubenswrapper[4816]: I0316 00:11:04.749050 4816 patch_prober.go:28] interesting pod/downloads-7954f5f757-5rr7c container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Mar 16 00:11:04 crc kubenswrapper[4816]: I0316 00:11:04.749078 4816 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-5rr7c" podUID="0ec3cdc0-f024-43cf-b520-7d2437e0f8df" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Mar 16 00:11:04 crc kubenswrapper[4816]: I0316 00:11:04.749152 4816 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-console/downloads-7954f5f757-5rr7c" podUID="0ec3cdc0-f024-43cf-b520-7d2437e0f8df" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Mar 16 00:11:04 crc kubenswrapper[4816]: I0316 00:11:04.749234 4816 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console/downloads-7954f5f757-5rr7c" Mar 16 00:11:04 crc kubenswrapper[4816]: I0316 00:11:04.749952 4816 patch_prober.go:28] interesting pod/downloads-7954f5f757-5rr7c container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Mar 16 00:11:04 crc kubenswrapper[4816]: I0316 00:11:04.750169 4816 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-5rr7c" podUID="0ec3cdc0-f024-43cf-b520-7d2437e0f8df" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Mar 16 00:11:04 crc kubenswrapper[4816]: I0316 00:11:04.750413 4816 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="download-server" containerStatusID={"Type":"cri-o","ID":"9e1758947a169fa8c89c8e3873ca56d930c8ca55c7c143100afca371ccc218fc"} pod="openshift-console/downloads-7954f5f757-5rr7c" containerMessage="Container download-server failed liveness probe, will be restarted" Mar 16 00:11:04 crc kubenswrapper[4816]: I0316 00:11:04.750772 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/downloads-7954f5f757-5rr7c" podUID="0ec3cdc0-f024-43cf-b520-7d2437e0f8df" containerName="download-server" containerID="cri-o://9e1758947a169fa8c89c8e3873ca56d930c8ca55c7c143100afca371ccc218fc" gracePeriod=2 Mar 16 00:11:05 crc kubenswrapper[4816]: I0316 00:11:05.707415 4816 patch_prober.go:28] 
interesting pod/route-controller-manager-6d597ffc5b-jhblv container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.55:8443/healthz\": dial tcp 10.217.0.55:8443: connect: connection refused" start-of-body= Mar 16 00:11:05 crc kubenswrapper[4816]: I0316 00:11:05.707777 4816 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6d597ffc5b-jhblv" podUID="93c061cd-ed29-4f9c-ad3a-0fb204ce6f8d" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.55:8443/healthz\": dial tcp 10.217.0.55:8443: connect: connection refused" Mar 16 00:11:05 crc kubenswrapper[4816]: I0316 00:11:05.782819 4816 generic.go:334] "Generic (PLEG): container finished" podID="0ec3cdc0-f024-43cf-b520-7d2437e0f8df" containerID="9e1758947a169fa8c89c8e3873ca56d930c8ca55c7c143100afca371ccc218fc" exitCode=0 Mar 16 00:11:05 crc kubenswrapper[4816]: I0316 00:11:05.782864 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-5rr7c" event={"ID":"0ec3cdc0-f024-43cf-b520-7d2437e0f8df","Type":"ContainerDied","Data":"9e1758947a169fa8c89c8e3873ca56d930c8ca55c7c143100afca371ccc218fc"} Mar 16 00:11:06 crc kubenswrapper[4816]: I0316 00:11:06.701642 4816 patch_prober.go:28] interesting pod/controller-manager-64d6fc58d9-vgh6t container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.54:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 16 00:11:06 crc kubenswrapper[4816]: I0316 00:11:06.701791 4816 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-64d6fc58d9-vgh6t" podUID="1feb17b5-7946-4727-a954-d516a9b8469b" containerName="controller-manager" probeResult="failure" output="Get 
\"https://10.217.0.54:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 16 00:11:14 crc kubenswrapper[4816]: I0316 00:11:14.423798 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-nnqsw" Mar 16 00:11:14 crc kubenswrapper[4816]: I0316 00:11:14.432060 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-nnqsw" Mar 16 00:11:14 crc kubenswrapper[4816]: I0316 00:11:14.747970 4816 patch_prober.go:28] interesting pod/downloads-7954f5f757-5rr7c container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Mar 16 00:11:14 crc kubenswrapper[4816]: I0316 00:11:14.748331 4816 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-5rr7c" podUID="0ec3cdc0-f024-43cf-b520-7d2437e0f8df" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Mar 16 00:11:15 crc kubenswrapper[4816]: I0316 00:11:15.167733 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jm8db" Mar 16 00:11:15 crc kubenswrapper[4816]: I0316 00:11:15.245997 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-64d6fc58d9-vgh6t" Mar 16 00:11:15 crc kubenswrapper[4816]: I0316 00:11:15.283970 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7dc5f6d8dc-9cs6b"] Mar 16 00:11:15 crc kubenswrapper[4816]: E0316 00:11:15.284243 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c90a604-be49-44dc-b350-9df660d8587b" containerName="pruner" Mar 16 00:11:15 crc kubenswrapper[4816]: I0316 00:11:15.284259 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c90a604-be49-44dc-b350-9df660d8587b" containerName="pruner" Mar 16 00:11:15 crc kubenswrapper[4816]: E0316 00:11:15.284273 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46c3b4df-48c2-4131-8ac4-ea6276d70d54" containerName="pruner" Mar 16 00:11:15 crc kubenswrapper[4816]: I0316 00:11:15.284282 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="46c3b4df-48c2-4131-8ac4-ea6276d70d54" containerName="pruner" Mar 16 00:11:15 crc kubenswrapper[4816]: E0316 00:11:15.284302 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55e76e8f-7d69-4f55-81f8-45c9c612876b" containerName="oc" Mar 16 00:11:15 crc kubenswrapper[4816]: I0316 00:11:15.284311 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="55e76e8f-7d69-4f55-81f8-45c9c612876b" containerName="oc" Mar 16 00:11:15 crc kubenswrapper[4816]: E0316 00:11:15.284325 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9397185e-a9e3-4ef4-b0be-d9dc9208adff" containerName="collect-profiles" Mar 16 00:11:15 crc kubenswrapper[4816]: I0316 00:11:15.284333 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="9397185e-a9e3-4ef4-b0be-d9dc9208adff" containerName="collect-profiles" Mar 16 00:11:15 crc kubenswrapper[4816]: E0316 00:11:15.284342 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1feb17b5-7946-4727-a954-d516a9b8469b" 
containerName="controller-manager" Mar 16 00:11:15 crc kubenswrapper[4816]: I0316 00:11:15.284350 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="1feb17b5-7946-4727-a954-d516a9b8469b" containerName="controller-manager" Mar 16 00:11:15 crc kubenswrapper[4816]: I0316 00:11:15.284468 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c90a604-be49-44dc-b350-9df660d8587b" containerName="pruner" Mar 16 00:11:15 crc kubenswrapper[4816]: I0316 00:11:15.284483 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="1feb17b5-7946-4727-a954-d516a9b8469b" containerName="controller-manager" Mar 16 00:11:15 crc kubenswrapper[4816]: I0316 00:11:15.284496 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="55e76e8f-7d69-4f55-81f8-45c9c612876b" containerName="oc" Mar 16 00:11:15 crc kubenswrapper[4816]: I0316 00:11:15.284510 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="46c3b4df-48c2-4131-8ac4-ea6276d70d54" containerName="pruner" Mar 16 00:11:15 crc kubenswrapper[4816]: I0316 00:11:15.284524 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="9397185e-a9e3-4ef4-b0be-d9dc9208adff" containerName="collect-profiles" Mar 16 00:11:15 crc kubenswrapper[4816]: I0316 00:11:15.292330 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:11:15 crc kubenswrapper[4816]: I0316 00:11:15.292373 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7dc5f6d8dc-9cs6b"] Mar 16 00:11:15 crc kubenswrapper[4816]: I0316 00:11:15.293223 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7dc5f6d8dc-9cs6b" Mar 16 00:11:15 crc kubenswrapper[4816]: I0316 00:11:15.401319 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1feb17b5-7946-4727-a954-d516a9b8469b-client-ca\") pod \"1feb17b5-7946-4727-a954-d516a9b8469b\" (UID: \"1feb17b5-7946-4727-a954-d516a9b8469b\") " Mar 16 00:11:15 crc kubenswrapper[4816]: I0316 00:11:15.401369 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1feb17b5-7946-4727-a954-d516a9b8469b-serving-cert\") pod \"1feb17b5-7946-4727-a954-d516a9b8469b\" (UID: \"1feb17b5-7946-4727-a954-d516a9b8469b\") " Mar 16 00:11:15 crc kubenswrapper[4816]: I0316 00:11:15.401437 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jnvd7\" (UniqueName: \"kubernetes.io/projected/1feb17b5-7946-4727-a954-d516a9b8469b-kube-api-access-jnvd7\") pod \"1feb17b5-7946-4727-a954-d516a9b8469b\" (UID: \"1feb17b5-7946-4727-a954-d516a9b8469b\") " Mar 16 00:11:15 crc kubenswrapper[4816]: I0316 00:11:15.401479 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1feb17b5-7946-4727-a954-d516a9b8469b-config\") pod \"1feb17b5-7946-4727-a954-d516a9b8469b\" (UID: \"1feb17b5-7946-4727-a954-d516a9b8469b\") " Mar 16 00:11:15 crc kubenswrapper[4816]: I0316 00:11:15.401505 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1feb17b5-7946-4727-a954-d516a9b8469b-proxy-ca-bundles\") pod \"1feb17b5-7946-4727-a954-d516a9b8469b\" (UID: \"1feb17b5-7946-4727-a954-d516a9b8469b\") " Mar 16 00:11:15 crc kubenswrapper[4816]: I0316 00:11:15.401728 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a8a656b3-b809-44ee-bcc8-cfec2ae27ec7-proxy-ca-bundles\") pod \"controller-manager-7dc5f6d8dc-9cs6b\" (UID: \"a8a656b3-b809-44ee-bcc8-cfec2ae27ec7\") " pod="openshift-controller-manager/controller-manager-7dc5f6d8dc-9cs6b" Mar 16 00:11:15 crc kubenswrapper[4816]: I0316 00:11:15.401768 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hl55f\" (UniqueName: \"kubernetes.io/projected/a8a656b3-b809-44ee-bcc8-cfec2ae27ec7-kube-api-access-hl55f\") pod \"controller-manager-7dc5f6d8dc-9cs6b\" (UID: \"a8a656b3-b809-44ee-bcc8-cfec2ae27ec7\") " pod="openshift-controller-manager/controller-manager-7dc5f6d8dc-9cs6b" Mar 16 00:11:15 crc kubenswrapper[4816]: I0316 00:11:15.402034 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a8a656b3-b809-44ee-bcc8-cfec2ae27ec7-client-ca\") pod \"controller-manager-7dc5f6d8dc-9cs6b\" (UID: \"a8a656b3-b809-44ee-bcc8-cfec2ae27ec7\") " pod="openshift-controller-manager/controller-manager-7dc5f6d8dc-9cs6b" Mar 16 00:11:15 crc kubenswrapper[4816]: I0316 00:11:15.402063 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a8a656b3-b809-44ee-bcc8-cfec2ae27ec7-serving-cert\") pod \"controller-manager-7dc5f6d8dc-9cs6b\" (UID: \"a8a656b3-b809-44ee-bcc8-cfec2ae27ec7\") " pod="openshift-controller-manager/controller-manager-7dc5f6d8dc-9cs6b" Mar 16 00:11:15 crc kubenswrapper[4816]: I0316 00:11:15.402137 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8a656b3-b809-44ee-bcc8-cfec2ae27ec7-config\") pod \"controller-manager-7dc5f6d8dc-9cs6b\" (UID: \"a8a656b3-b809-44ee-bcc8-cfec2ae27ec7\") " 
pod="openshift-controller-manager/controller-manager-7dc5f6d8dc-9cs6b" Mar 16 00:11:15 crc kubenswrapper[4816]: I0316 00:11:15.402219 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1feb17b5-7946-4727-a954-d516a9b8469b-client-ca" (OuterVolumeSpecName: "client-ca") pod "1feb17b5-7946-4727-a954-d516a9b8469b" (UID: "1feb17b5-7946-4727-a954-d516a9b8469b"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:11:15 crc kubenswrapper[4816]: I0316 00:11:15.402296 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1feb17b5-7946-4727-a954-d516a9b8469b-config" (OuterVolumeSpecName: "config") pod "1feb17b5-7946-4727-a954-d516a9b8469b" (UID: "1feb17b5-7946-4727-a954-d516a9b8469b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:11:15 crc kubenswrapper[4816]: I0316 00:11:15.402471 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1feb17b5-7946-4727-a954-d516a9b8469b-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "1feb17b5-7946-4727-a954-d516a9b8469b" (UID: "1feb17b5-7946-4727-a954-d516a9b8469b"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:11:15 crc kubenswrapper[4816]: I0316 00:11:15.406394 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1feb17b5-7946-4727-a954-d516a9b8469b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1feb17b5-7946-4727-a954-d516a9b8469b" (UID: "1feb17b5-7946-4727-a954-d516a9b8469b"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:11:15 crc kubenswrapper[4816]: I0316 00:11:15.407527 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1feb17b5-7946-4727-a954-d516a9b8469b-kube-api-access-jnvd7" (OuterVolumeSpecName: "kube-api-access-jnvd7") pod "1feb17b5-7946-4727-a954-d516a9b8469b" (UID: "1feb17b5-7946-4727-a954-d516a9b8469b"). InnerVolumeSpecName "kube-api-access-jnvd7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:11:15 crc kubenswrapper[4816]: I0316 00:11:15.503970 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a8a656b3-b809-44ee-bcc8-cfec2ae27ec7-proxy-ca-bundles\") pod \"controller-manager-7dc5f6d8dc-9cs6b\" (UID: \"a8a656b3-b809-44ee-bcc8-cfec2ae27ec7\") " pod="openshift-controller-manager/controller-manager-7dc5f6d8dc-9cs6b" Mar 16 00:11:15 crc kubenswrapper[4816]: I0316 00:11:15.504038 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hl55f\" (UniqueName: \"kubernetes.io/projected/a8a656b3-b809-44ee-bcc8-cfec2ae27ec7-kube-api-access-hl55f\") pod \"controller-manager-7dc5f6d8dc-9cs6b\" (UID: \"a8a656b3-b809-44ee-bcc8-cfec2ae27ec7\") " pod="openshift-controller-manager/controller-manager-7dc5f6d8dc-9cs6b" Mar 16 00:11:15 crc kubenswrapper[4816]: I0316 00:11:15.504077 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a8a656b3-b809-44ee-bcc8-cfec2ae27ec7-client-ca\") pod \"controller-manager-7dc5f6d8dc-9cs6b\" (UID: \"a8a656b3-b809-44ee-bcc8-cfec2ae27ec7\") " pod="openshift-controller-manager/controller-manager-7dc5f6d8dc-9cs6b" Mar 16 00:11:15 crc kubenswrapper[4816]: I0316 00:11:15.504106 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/a8a656b3-b809-44ee-bcc8-cfec2ae27ec7-serving-cert\") pod \"controller-manager-7dc5f6d8dc-9cs6b\" (UID: \"a8a656b3-b809-44ee-bcc8-cfec2ae27ec7\") " pod="openshift-controller-manager/controller-manager-7dc5f6d8dc-9cs6b" Mar 16 00:11:15 crc kubenswrapper[4816]: I0316 00:11:15.504159 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8a656b3-b809-44ee-bcc8-cfec2ae27ec7-config\") pod \"controller-manager-7dc5f6d8dc-9cs6b\" (UID: \"a8a656b3-b809-44ee-bcc8-cfec2ae27ec7\") " pod="openshift-controller-manager/controller-manager-7dc5f6d8dc-9cs6b" Mar 16 00:11:15 crc kubenswrapper[4816]: I0316 00:11:15.504236 4816 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1feb17b5-7946-4727-a954-d516a9b8469b-client-ca\") on node \"crc\" DevicePath \"\"" Mar 16 00:11:15 crc kubenswrapper[4816]: I0316 00:11:15.504250 4816 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1feb17b5-7946-4727-a954-d516a9b8469b-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 16 00:11:15 crc kubenswrapper[4816]: I0316 00:11:15.504264 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jnvd7\" (UniqueName: \"kubernetes.io/projected/1feb17b5-7946-4727-a954-d516a9b8469b-kube-api-access-jnvd7\") on node \"crc\" DevicePath \"\"" Mar 16 00:11:15 crc kubenswrapper[4816]: I0316 00:11:15.504277 4816 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1feb17b5-7946-4727-a954-d516a9b8469b-config\") on node \"crc\" DevicePath \"\"" Mar 16 00:11:15 crc kubenswrapper[4816]: I0316 00:11:15.504289 4816 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1feb17b5-7946-4727-a954-d516a9b8469b-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 16 00:11:15 crc 
kubenswrapper[4816]: I0316 00:11:15.507481 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a8a656b3-b809-44ee-bcc8-cfec2ae27ec7-client-ca\") pod \"controller-manager-7dc5f6d8dc-9cs6b\" (UID: \"a8a656b3-b809-44ee-bcc8-cfec2ae27ec7\") " pod="openshift-controller-manager/controller-manager-7dc5f6d8dc-9cs6b" Mar 16 00:11:15 crc kubenswrapper[4816]: I0316 00:11:15.508875 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a8a656b3-b809-44ee-bcc8-cfec2ae27ec7-proxy-ca-bundles\") pod \"controller-manager-7dc5f6d8dc-9cs6b\" (UID: \"a8a656b3-b809-44ee-bcc8-cfec2ae27ec7\") " pod="openshift-controller-manager/controller-manager-7dc5f6d8dc-9cs6b" Mar 16 00:11:15 crc kubenswrapper[4816]: I0316 00:11:15.509274 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8a656b3-b809-44ee-bcc8-cfec2ae27ec7-config\") pod \"controller-manager-7dc5f6d8dc-9cs6b\" (UID: \"a8a656b3-b809-44ee-bcc8-cfec2ae27ec7\") " pod="openshift-controller-manager/controller-manager-7dc5f6d8dc-9cs6b" Mar 16 00:11:15 crc kubenswrapper[4816]: I0316 00:11:15.511049 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a8a656b3-b809-44ee-bcc8-cfec2ae27ec7-serving-cert\") pod \"controller-manager-7dc5f6d8dc-9cs6b\" (UID: \"a8a656b3-b809-44ee-bcc8-cfec2ae27ec7\") " pod="openshift-controller-manager/controller-manager-7dc5f6d8dc-9cs6b" Mar 16 00:11:15 crc kubenswrapper[4816]: I0316 00:11:15.520690 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hl55f\" (UniqueName: \"kubernetes.io/projected/a8a656b3-b809-44ee-bcc8-cfec2ae27ec7-kube-api-access-hl55f\") pod \"controller-manager-7dc5f6d8dc-9cs6b\" (UID: \"a8a656b3-b809-44ee-bcc8-cfec2ae27ec7\") " 
pod="openshift-controller-manager/controller-manager-7dc5f6d8dc-9cs6b" Mar 16 00:11:15 crc kubenswrapper[4816]: I0316 00:11:15.610025 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7dc5f6d8dc-9cs6b" Mar 16 00:11:15 crc kubenswrapper[4816]: I0316 00:11:15.707185 4816 patch_prober.go:28] interesting pod/route-controller-manager-6d597ffc5b-jhblv container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.55:8443/healthz\": dial tcp 10.217.0.55:8443: connect: connection refused" start-of-body= Mar 16 00:11:15 crc kubenswrapper[4816]: I0316 00:11:15.707249 4816 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6d597ffc5b-jhblv" podUID="93c061cd-ed29-4f9c-ad3a-0fb204ce6f8d" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.55:8443/healthz\": dial tcp 10.217.0.55:8443: connect: connection refused" Mar 16 00:11:15 crc kubenswrapper[4816]: I0316 00:11:15.844398 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-64d6fc58d9-vgh6t" event={"ID":"1feb17b5-7946-4727-a954-d516a9b8469b","Type":"ContainerDied","Data":"2b950732b2a5bf6036d818014b94cf2a7cdbaaa448fc7e9ce26ccb0e98f8f687"} Mar 16 00:11:15 crc kubenswrapper[4816]: I0316 00:11:15.844452 4816 scope.go:117] "RemoveContainer" containerID="ed184c9fb61316e6cf511969f5857732bc0cbe98f1fc984c31c588b0377ff308" Mar 16 00:11:15 crc kubenswrapper[4816]: I0316 00:11:15.845154 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-64d6fc58d9-vgh6t" Mar 16 00:11:15 crc kubenswrapper[4816]: I0316 00:11:15.868569 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-64d6fc58d9-vgh6t"] Mar 16 00:11:15 crc kubenswrapper[4816]: I0316 00:11:15.875524 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-64d6fc58d9-vgh6t"] Mar 16 00:11:17 crc kubenswrapper[4816]: I0316 00:11:17.001024 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 16 00:11:17 crc kubenswrapper[4816]: I0316 00:11:17.001920 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 16 00:11:17 crc kubenswrapper[4816]: I0316 00:11:17.006372 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 16 00:11:17 crc kubenswrapper[4816]: I0316 00:11:17.006496 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 16 00:11:17 crc kubenswrapper[4816]: I0316 00:11:17.011685 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 16 00:11:17 crc kubenswrapper[4816]: I0316 00:11:17.152898 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2f10e41c-e6db-4083-bb86-ed0d39cc1a5a-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"2f10e41c-e6db-4083-bb86-ed0d39cc1a5a\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 16 00:11:17 crc kubenswrapper[4816]: I0316 00:11:17.152990 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/2f10e41c-e6db-4083-bb86-ed0d39cc1a5a-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"2f10e41c-e6db-4083-bb86-ed0d39cc1a5a\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 16 00:11:17 crc kubenswrapper[4816]: I0316 00:11:17.254817 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2f10e41c-e6db-4083-bb86-ed0d39cc1a5a-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"2f10e41c-e6db-4083-bb86-ed0d39cc1a5a\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 16 00:11:17 crc kubenswrapper[4816]: I0316 00:11:17.254909 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2f10e41c-e6db-4083-bb86-ed0d39cc1a5a-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"2f10e41c-e6db-4083-bb86-ed0d39cc1a5a\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 16 00:11:17 crc kubenswrapper[4816]: I0316 00:11:17.255001 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2f10e41c-e6db-4083-bb86-ed0d39cc1a5a-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"2f10e41c-e6db-4083-bb86-ed0d39cc1a5a\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 16 00:11:17 crc kubenswrapper[4816]: I0316 00:11:17.271224 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2f10e41c-e6db-4083-bb86-ed0d39cc1a5a-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"2f10e41c-e6db-4083-bb86-ed0d39cc1a5a\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 16 00:11:17 crc kubenswrapper[4816]: I0316 00:11:17.324958 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 16 00:11:17 crc kubenswrapper[4816]: I0316 00:11:17.678243 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1feb17b5-7946-4727-a954-d516a9b8469b" path="/var/lib/kubelet/pods/1feb17b5-7946-4727-a954-d516a9b8469b/volumes" Mar 16 00:11:21 crc kubenswrapper[4816]: I0316 00:11:21.908472 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7dc5f6d8dc-9cs6b"] Mar 16 00:11:21 crc kubenswrapper[4816]: I0316 00:11:21.963885 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 16 00:11:21 crc kubenswrapper[4816]: I0316 00:11:21.964806 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 16 00:11:21 crc kubenswrapper[4816]: I0316 00:11:21.982913 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 16 00:11:22 crc kubenswrapper[4816]: I0316 00:11:22.017743 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a9416716-e666-46d6-9d77-fe5c9702c035-var-lock\") pod \"installer-9-crc\" (UID: \"a9416716-e666-46d6-9d77-fe5c9702c035\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 16 00:11:22 crc kubenswrapper[4816]: I0316 00:11:22.017843 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a9416716-e666-46d6-9d77-fe5c9702c035-kube-api-access\") pod \"installer-9-crc\" (UID: \"a9416716-e666-46d6-9d77-fe5c9702c035\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 16 00:11:22 crc kubenswrapper[4816]: I0316 00:11:22.017865 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" 
(UniqueName: \"kubernetes.io/host-path/a9416716-e666-46d6-9d77-fe5c9702c035-kubelet-dir\") pod \"installer-9-crc\" (UID: \"a9416716-e666-46d6-9d77-fe5c9702c035\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 16 00:11:22 crc kubenswrapper[4816]: I0316 00:11:22.119288 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a9416716-e666-46d6-9d77-fe5c9702c035-var-lock\") pod \"installer-9-crc\" (UID: \"a9416716-e666-46d6-9d77-fe5c9702c035\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 16 00:11:22 crc kubenswrapper[4816]: I0316 00:11:22.119380 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a9416716-e666-46d6-9d77-fe5c9702c035-kube-api-access\") pod \"installer-9-crc\" (UID: \"a9416716-e666-46d6-9d77-fe5c9702c035\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 16 00:11:22 crc kubenswrapper[4816]: I0316 00:11:22.119408 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a9416716-e666-46d6-9d77-fe5c9702c035-kubelet-dir\") pod \"installer-9-crc\" (UID: \"a9416716-e666-46d6-9d77-fe5c9702c035\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 16 00:11:22 crc kubenswrapper[4816]: I0316 00:11:22.119416 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a9416716-e666-46d6-9d77-fe5c9702c035-var-lock\") pod \"installer-9-crc\" (UID: \"a9416716-e666-46d6-9d77-fe5c9702c035\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 16 00:11:22 crc kubenswrapper[4816]: I0316 00:11:22.119468 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a9416716-e666-46d6-9d77-fe5c9702c035-kubelet-dir\") pod \"installer-9-crc\" (UID: 
\"a9416716-e666-46d6-9d77-fe5c9702c035\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 16 00:11:22 crc kubenswrapper[4816]: I0316 00:11:22.136380 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a9416716-e666-46d6-9d77-fe5c9702c035-kube-api-access\") pod \"installer-9-crc\" (UID: \"a9416716-e666-46d6-9d77-fe5c9702c035\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 16 00:11:22 crc kubenswrapper[4816]: I0316 00:11:22.279475 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 16 00:11:24 crc kubenswrapper[4816]: I0316 00:11:24.747496 4816 patch_prober.go:28] interesting pod/downloads-7954f5f757-5rr7c container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Mar 16 00:11:24 crc kubenswrapper[4816]: I0316 00:11:24.747905 4816 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-5rr7c" podUID="0ec3cdc0-f024-43cf-b520-7d2437e0f8df" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Mar 16 00:11:26 crc kubenswrapper[4816]: I0316 00:11:26.707510 4816 patch_prober.go:28] interesting pod/route-controller-manager-6d597ffc5b-jhblv container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.55:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 16 00:11:28 crc kubenswrapper[4816]: I0316 00:11:26.707617 4816 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6d597ffc5b-jhblv" 
podUID="93c061cd-ed29-4f9c-ad3a-0fb204ce6f8d" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.55:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 16 00:11:28 crc kubenswrapper[4816]: E0316 00:11:27.895891 4816 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 16 00:11:28 crc kubenswrapper[4816]: E0316 00:11:27.896079 4816 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8gwq2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorPro
file:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-8q2xw_openshift-marketplace(bf586c6e-f957-46fc-8140-f9a9ea22510f): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 16 00:11:28 crc kubenswrapper[4816]: E0316 00:11:27.897331 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-8q2xw" podUID="bf586c6e-f957-46fc-8140-f9a9ea22510f" Mar 16 00:11:31 crc kubenswrapper[4816]: I0316 00:11:31.863311 4816 patch_prober.go:28] interesting pod/machine-config-daemon-jrdcz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 16 00:11:31 crc kubenswrapper[4816]: I0316 00:11:31.863818 4816 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" podUID="dd08ece2-7636-4966-973a-e96a34b70b53" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 16 00:11:32 crc kubenswrapper[4816]: E0316 00:11:32.843728 4816 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Mar 16 00:11:32 crc kubenswrapper[4816]: E0316 00:11:32.843903 4816 
kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qk7j9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-hvpqn_openshift-marketplace(cc1ea93d-1cf8-4145-ad35-83f2d1357f9d): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 16 00:11:32 crc kubenswrapper[4816]: E0316 00:11:32.845160 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-hvpqn" podUID="cc1ea93d-1cf8-4145-ad35-83f2d1357f9d" Mar 16 00:11:34 crc kubenswrapper[4816]: I0316 00:11:34.748540 4816 patch_prober.go:28] interesting pod/downloads-7954f5f757-5rr7c container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Mar 16 00:11:34 crc kubenswrapper[4816]: I0316 00:11:34.749705 4816 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-5rr7c" podUID="0ec3cdc0-f024-43cf-b520-7d2437e0f8df" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Mar 16 00:11:36 crc kubenswrapper[4816]: I0316 00:11:36.707469 4816 patch_prober.go:28] interesting pod/route-controller-manager-6d597ffc5b-jhblv container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.55:8443/healthz\": dial tcp 10.217.0.55:8443: i/o timeout" start-of-body= Mar 16 00:11:36 crc kubenswrapper[4816]: I0316 00:11:36.707817 4816 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6d597ffc5b-jhblv" podUID="93c061cd-ed29-4f9c-ad3a-0fb204ce6f8d" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.55:8443/healthz\": dial tcp 10.217.0.55:8443: i/o timeout" Mar 16 00:11:37 crc kubenswrapper[4816]: E0316 00:11:37.852354 4816 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" 
image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 16 00:11:37 crc kubenswrapper[4816]: E0316 00:11:37.852808 4816 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-w4mjs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-wh2h7_openshift-marketplace(b1b3efd0-cdc0-4973-8077-bcd1ea567bdd): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 16 
00:11:37 crc kubenswrapper[4816]: E0316 00:11:37.854365 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-wh2h7" podUID="b1b3efd0-cdc0-4973-8077-bcd1ea567bdd" Mar 16 00:11:39 crc kubenswrapper[4816]: E0316 00:11:39.385971 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-wh2h7" podUID="b1b3efd0-cdc0-4973-8077-bcd1ea567bdd" Mar 16 00:11:40 crc kubenswrapper[4816]: E0316 00:11:40.052943 4816 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 16 00:11:40 crc kubenswrapper[4816]: E0316 00:11:40.053289 4816 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-45bbd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-4gwcw_openshift-marketplace(ad80e1a9-75dc-4860-9bd9-d59b0c0ae43c): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 16 00:11:40 crc kubenswrapper[4816]: E0316 00:11:40.054441 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-4gwcw" podUID="ad80e1a9-75dc-4860-9bd9-d59b0c0ae43c" Mar 16 00:11:41 crc 
kubenswrapper[4816]: E0316 00:11:41.822687 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-4gwcw" podUID="ad80e1a9-75dc-4860-9bd9-d59b0c0ae43c" Mar 16 00:11:41 crc kubenswrapper[4816]: E0316 00:11:41.883993 4816 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Mar 16 00:11:41 crc kubenswrapper[4816]: E0316 00:11:41.884229 4816 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mrpff,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-52qs6_openshift-marketplace(6ca6c2c9-3a12-4eb3-9df1-7fdea640791d): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 16 00:11:41 crc kubenswrapper[4816]: E0316 00:11:41.886090 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-52qs6" podUID="6ca6c2c9-3a12-4eb3-9df1-7fdea640791d" Mar 16 00:11:41 crc 
kubenswrapper[4816]: I0316 00:11:41.915109 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6d597ffc5b-jhblv" Mar 16 00:11:41 crc kubenswrapper[4816]: E0316 00:11:41.915914 4816 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Mar 16 00:11:41 crc kubenswrapper[4816]: E0316 00:11:41.916097 4816 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-l2d9d,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:
[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-6gbkl_openshift-marketplace(bad7b5f7-88a8-4c20-a010-734a46f59e05): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 16 00:11:41 crc kubenswrapper[4816]: E0316 00:11:41.918358 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-6gbkl" podUID="bad7b5f7-88a8-4c20-a010-734a46f59e05" Mar 16 00:11:41 crc kubenswrapper[4816]: I0316 00:11:41.950409 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-66c4db87dd-spzwx"] Mar 16 00:11:41 crc kubenswrapper[4816]: E0316 00:11:41.951799 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93c061cd-ed29-4f9c-ad3a-0fb204ce6f8d" containerName="route-controller-manager" Mar 16 00:11:41 crc kubenswrapper[4816]: I0316 00:11:41.951826 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="93c061cd-ed29-4f9c-ad3a-0fb204ce6f8d" containerName="route-controller-manager" Mar 16 00:11:41 crc kubenswrapper[4816]: I0316 00:11:41.968411 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="93c061cd-ed29-4f9c-ad3a-0fb204ce6f8d" containerName="route-controller-manager" Mar 16 00:11:41 crc kubenswrapper[4816]: I0316 00:11:41.969343 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-66c4db87dd-spzwx" Mar 16 00:11:41 crc kubenswrapper[4816]: I0316 00:11:41.970311 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-66c4db87dd-spzwx"] Mar 16 00:11:42 crc kubenswrapper[4816]: I0316 00:11:42.016881 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6d597ffc5b-jhblv" event={"ID":"93c061cd-ed29-4f9c-ad3a-0fb204ce6f8d","Type":"ContainerDied","Data":"e26612b919b84db051d8d1f5b5f0b9a5f292e7d09dcad15803116b4fcd5c25d6"} Mar 16 00:11:42 crc kubenswrapper[4816]: I0316 00:11:42.016989 4816 scope.go:117] "RemoveContainer" containerID="8fdaf15dae2b09a126e743d43c57d752450921c814448bf980f67e094859a0df" Mar 16 00:11:42 crc kubenswrapper[4816]: I0316 00:11:42.017224 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6d597ffc5b-jhblv" Mar 16 00:11:42 crc kubenswrapper[4816]: E0316 00:11:42.020298 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-6gbkl" podUID="bad7b5f7-88a8-4c20-a010-734a46f59e05" Mar 16 00:11:42 crc kubenswrapper[4816]: E0316 00:11:42.020772 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-52qs6" podUID="6ca6c2c9-3a12-4eb3-9df1-7fdea640791d" Mar 16 00:11:42 crc kubenswrapper[4816]: I0316 00:11:42.037542 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/93c061cd-ed29-4f9c-ad3a-0fb204ce6f8d-serving-cert\") pod \"93c061cd-ed29-4f9c-ad3a-0fb204ce6f8d\" (UID: \"93c061cd-ed29-4f9c-ad3a-0fb204ce6f8d\") " Mar 16 00:11:42 crc kubenswrapper[4816]: I0316 00:11:42.037785 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/93c061cd-ed29-4f9c-ad3a-0fb204ce6f8d-client-ca\") pod \"93c061cd-ed29-4f9c-ad3a-0fb204ce6f8d\" (UID: \"93c061cd-ed29-4f9c-ad3a-0fb204ce6f8d\") " Mar 16 00:11:42 crc kubenswrapper[4816]: I0316 00:11:42.037862 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93c061cd-ed29-4f9c-ad3a-0fb204ce6f8d-config\") pod \"93c061cd-ed29-4f9c-ad3a-0fb204ce6f8d\" (UID: \"93c061cd-ed29-4f9c-ad3a-0fb204ce6f8d\") " Mar 16 00:11:42 crc kubenswrapper[4816]: I0316 00:11:42.037928 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qt6qg\" (UniqueName: \"kubernetes.io/projected/93c061cd-ed29-4f9c-ad3a-0fb204ce6f8d-kube-api-access-qt6qg\") pod \"93c061cd-ed29-4f9c-ad3a-0fb204ce6f8d\" (UID: \"93c061cd-ed29-4f9c-ad3a-0fb204ce6f8d\") " Mar 16 00:11:42 crc kubenswrapper[4816]: I0316 00:11:42.046274 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93c061cd-ed29-4f9c-ad3a-0fb204ce6f8d-config" (OuterVolumeSpecName: "config") pod "93c061cd-ed29-4f9c-ad3a-0fb204ce6f8d" (UID: "93c061cd-ed29-4f9c-ad3a-0fb204ce6f8d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:11:42 crc kubenswrapper[4816]: I0316 00:11:42.046266 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93c061cd-ed29-4f9c-ad3a-0fb204ce6f8d-client-ca" (OuterVolumeSpecName: "client-ca") pod "93c061cd-ed29-4f9c-ad3a-0fb204ce6f8d" (UID: "93c061cd-ed29-4f9c-ad3a-0fb204ce6f8d"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:11:42 crc kubenswrapper[4816]: I0316 00:11:42.050172 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93c061cd-ed29-4f9c-ad3a-0fb204ce6f8d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "93c061cd-ed29-4f9c-ad3a-0fb204ce6f8d" (UID: "93c061cd-ed29-4f9c-ad3a-0fb204ce6f8d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:11:42 crc kubenswrapper[4816]: I0316 00:11:42.050256 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93c061cd-ed29-4f9c-ad3a-0fb204ce6f8d-kube-api-access-qt6qg" (OuterVolumeSpecName: "kube-api-access-qt6qg") pod "93c061cd-ed29-4f9c-ad3a-0fb204ce6f8d" (UID: "93c061cd-ed29-4f9c-ad3a-0fb204ce6f8d"). InnerVolumeSpecName "kube-api-access-qt6qg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:11:42 crc kubenswrapper[4816]: I0316 00:11:42.140545 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-snc4z\" (UniqueName: \"kubernetes.io/projected/d843d76b-9317-42aa-848b-e3e11c3106cb-kube-api-access-snc4z\") pod \"route-controller-manager-66c4db87dd-spzwx\" (UID: \"d843d76b-9317-42aa-848b-e3e11c3106cb\") " pod="openshift-route-controller-manager/route-controller-manager-66c4db87dd-spzwx" Mar 16 00:11:42 crc kubenswrapper[4816]: I0316 00:11:42.141156 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d843d76b-9317-42aa-848b-e3e11c3106cb-client-ca\") pod \"route-controller-manager-66c4db87dd-spzwx\" (UID: \"d843d76b-9317-42aa-848b-e3e11c3106cb\") " pod="openshift-route-controller-manager/route-controller-manager-66c4db87dd-spzwx" Mar 16 00:11:42 crc kubenswrapper[4816]: I0316 00:11:42.141189 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d843d76b-9317-42aa-848b-e3e11c3106cb-serving-cert\") pod \"route-controller-manager-66c4db87dd-spzwx\" (UID: \"d843d76b-9317-42aa-848b-e3e11c3106cb\") " pod="openshift-route-controller-manager/route-controller-manager-66c4db87dd-spzwx" Mar 16 00:11:42 crc kubenswrapper[4816]: I0316 00:11:42.141216 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d843d76b-9317-42aa-848b-e3e11c3106cb-config\") pod \"route-controller-manager-66c4db87dd-spzwx\" (UID: \"d843d76b-9317-42aa-848b-e3e11c3106cb\") " pod="openshift-route-controller-manager/route-controller-manager-66c4db87dd-spzwx" Mar 16 00:11:42 crc kubenswrapper[4816]: I0316 00:11:42.141301 4816 reconciler_common.go:293] "Volume detached for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/93c061cd-ed29-4f9c-ad3a-0fb204ce6f8d-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 16 00:11:42 crc kubenswrapper[4816]: I0316 00:11:42.141317 4816 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/93c061cd-ed29-4f9c-ad3a-0fb204ce6f8d-client-ca\") on node \"crc\" DevicePath \"\"" Mar 16 00:11:42 crc kubenswrapper[4816]: I0316 00:11:42.141327 4816 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93c061cd-ed29-4f9c-ad3a-0fb204ce6f8d-config\") on node \"crc\" DevicePath \"\"" Mar 16 00:11:42 crc kubenswrapper[4816]: I0316 00:11:42.141338 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qt6qg\" (UniqueName: \"kubernetes.io/projected/93c061cd-ed29-4f9c-ad3a-0fb204ce6f8d-kube-api-access-qt6qg\") on node \"crc\" DevicePath \"\"" Mar 16 00:11:42 crc kubenswrapper[4816]: I0316 00:11:42.201415 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7dc5f6d8dc-9cs6b"] Mar 16 00:11:42 crc kubenswrapper[4816]: W0316 00:11:42.207634 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda8a656b3_b809_44ee_bcc8_cfec2ae27ec7.slice/crio-29366548a9919c43dc76196eb98a6cdaf85a57ba7ece5ccd1f7de91db89bc729 WatchSource:0}: Error finding container 29366548a9919c43dc76196eb98a6cdaf85a57ba7ece5ccd1f7de91db89bc729: Status 404 returned error can't find the container with id 29366548a9919c43dc76196eb98a6cdaf85a57ba7ece5ccd1f7de91db89bc729 Mar 16 00:11:42 crc kubenswrapper[4816]: I0316 00:11:42.243172 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d843d76b-9317-42aa-848b-e3e11c3106cb-client-ca\") pod \"route-controller-manager-66c4db87dd-spzwx\" (UID: 
\"d843d76b-9317-42aa-848b-e3e11c3106cb\") " pod="openshift-route-controller-manager/route-controller-manager-66c4db87dd-spzwx" Mar 16 00:11:42 crc kubenswrapper[4816]: I0316 00:11:42.243218 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d843d76b-9317-42aa-848b-e3e11c3106cb-serving-cert\") pod \"route-controller-manager-66c4db87dd-spzwx\" (UID: \"d843d76b-9317-42aa-848b-e3e11c3106cb\") " pod="openshift-route-controller-manager/route-controller-manager-66c4db87dd-spzwx" Mar 16 00:11:42 crc kubenswrapper[4816]: I0316 00:11:42.243252 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d843d76b-9317-42aa-848b-e3e11c3106cb-config\") pod \"route-controller-manager-66c4db87dd-spzwx\" (UID: \"d843d76b-9317-42aa-848b-e3e11c3106cb\") " pod="openshift-route-controller-manager/route-controller-manager-66c4db87dd-spzwx" Mar 16 00:11:42 crc kubenswrapper[4816]: I0316 00:11:42.243315 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-snc4z\" (UniqueName: \"kubernetes.io/projected/d843d76b-9317-42aa-848b-e3e11c3106cb-kube-api-access-snc4z\") pod \"route-controller-manager-66c4db87dd-spzwx\" (UID: \"d843d76b-9317-42aa-848b-e3e11c3106cb\") " pod="openshift-route-controller-manager/route-controller-manager-66c4db87dd-spzwx" Mar 16 00:11:42 crc kubenswrapper[4816]: I0316 00:11:42.244863 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d843d76b-9317-42aa-848b-e3e11c3106cb-client-ca\") pod \"route-controller-manager-66c4db87dd-spzwx\" (UID: \"d843d76b-9317-42aa-848b-e3e11c3106cb\") " pod="openshift-route-controller-manager/route-controller-manager-66c4db87dd-spzwx" Mar 16 00:11:42 crc kubenswrapper[4816]: I0316 00:11:42.245296 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/d843d76b-9317-42aa-848b-e3e11c3106cb-config\") pod \"route-controller-manager-66c4db87dd-spzwx\" (UID: \"d843d76b-9317-42aa-848b-e3e11c3106cb\") " pod="openshift-route-controller-manager/route-controller-manager-66c4db87dd-spzwx" Mar 16 00:11:42 crc kubenswrapper[4816]: I0316 00:11:42.252198 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d843d76b-9317-42aa-848b-e3e11c3106cb-serving-cert\") pod \"route-controller-manager-66c4db87dd-spzwx\" (UID: \"d843d76b-9317-42aa-848b-e3e11c3106cb\") " pod="openshift-route-controller-manager/route-controller-manager-66c4db87dd-spzwx" Mar 16 00:11:42 crc kubenswrapper[4816]: I0316 00:11:42.263821 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-snc4z\" (UniqueName: \"kubernetes.io/projected/d843d76b-9317-42aa-848b-e3e11c3106cb-kube-api-access-snc4z\") pod \"route-controller-manager-66c4db87dd-spzwx\" (UID: \"d843d76b-9317-42aa-848b-e3e11c3106cb\") " pod="openshift-route-controller-manager/route-controller-manager-66c4db87dd-spzwx" Mar 16 00:11:42 crc kubenswrapper[4816]: I0316 00:11:42.313013 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-66c4db87dd-spzwx" Mar 16 00:11:42 crc kubenswrapper[4816]: I0316 00:11:42.345598 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6d597ffc5b-jhblv"] Mar 16 00:11:42 crc kubenswrapper[4816]: I0316 00:11:42.348896 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6d597ffc5b-jhblv"] Mar 16 00:11:42 crc kubenswrapper[4816]: E0316 00:11:42.387828 4816 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Mar 16 00:11:42 crc kubenswrapper[4816]: E0316 00:11:42.387982 4816 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-j24xj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-7pb49_openshift-marketplace(a5ba22dd-8e8e-4beb-a540-e5c9687810b8): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 16 00:11:42 crc kubenswrapper[4816]: E0316 00:11:42.389190 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-7pb49" podUID="a5ba22dd-8e8e-4beb-a540-e5c9687810b8" Mar 16 00:11:42 crc 
kubenswrapper[4816]: I0316 00:11:42.473489 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 16 00:11:42 crc kubenswrapper[4816]: I0316 00:11:42.485197 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 16 00:11:42 crc kubenswrapper[4816]: I0316 00:11:42.541531 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-66c4db87dd-spzwx"] Mar 16 00:11:43 crc kubenswrapper[4816]: I0316 00:11:43.022463 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7dc5f6d8dc-9cs6b" event={"ID":"a8a656b3-b809-44ee-bcc8-cfec2ae27ec7","Type":"ContainerStarted","Data":"1622414dc19bed94547791fb46aea5e67b087d0109b950fdff872fc6af3fe300"} Mar 16 00:11:43 crc kubenswrapper[4816]: I0316 00:11:43.022826 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7dc5f6d8dc-9cs6b" Mar 16 00:11:43 crc kubenswrapper[4816]: I0316 00:11:43.022838 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7dc5f6d8dc-9cs6b" event={"ID":"a8a656b3-b809-44ee-bcc8-cfec2ae27ec7","Type":"ContainerStarted","Data":"29366548a9919c43dc76196eb98a6cdaf85a57ba7ece5ccd1f7de91db89bc729"} Mar 16 00:11:43 crc kubenswrapper[4816]: I0316 00:11:43.022544 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-7dc5f6d8dc-9cs6b" podUID="a8a656b3-b809-44ee-bcc8-cfec2ae27ec7" containerName="controller-manager" containerID="cri-o://1622414dc19bed94547791fb46aea5e67b087d0109b950fdff872fc6af3fe300" gracePeriod=30 Mar 16 00:11:43 crc kubenswrapper[4816]: I0316 00:11:43.025180 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-66c4db87dd-spzwx" 
event={"ID":"d843d76b-9317-42aa-848b-e3e11c3106cb","Type":"ContainerStarted","Data":"f342289f8f13f5d89f00dac92a6b213282ffa583b6c1a48b772dae90dc55fd82"} Mar 16 00:11:43 crc kubenswrapper[4816]: I0316 00:11:43.026109 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"2f10e41c-e6db-4083-bb86-ed0d39cc1a5a","Type":"ContainerStarted","Data":"934833059dddc073b4415862e96aafb4ed1091c7c9cabca244d231fa20d34d92"} Mar 16 00:11:43 crc kubenswrapper[4816]: I0316 00:11:43.027758 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"a9416716-e666-46d6-9d77-fe5c9702c035","Type":"ContainerStarted","Data":"dd148857a3f8f3853eb8381f642acb80c9aad6dc4ab5491e0ecfe89f172f60d6"} Mar 16 00:11:43 crc kubenswrapper[4816]: I0316 00:11:43.033306 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-5rr7c" event={"ID":"0ec3cdc0-f024-43cf-b520-7d2437e0f8df","Type":"ContainerStarted","Data":"61c9642fe76a811268f1e4f78cdfc2538ce75e905554ea5575dfd07e151f6573"} Mar 16 00:11:43 crc kubenswrapper[4816]: I0316 00:11:43.034935 4816 patch_prober.go:28] interesting pod/downloads-7954f5f757-5rr7c container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Mar 16 00:11:43 crc kubenswrapper[4816]: I0316 00:11:43.034972 4816 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-5rr7c" podUID="0ec3cdc0-f024-43cf-b520-7d2437e0f8df" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Mar 16 00:11:43 crc kubenswrapper[4816]: E0316 00:11:43.035904 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with 
ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-7pb49" podUID="a5ba22dd-8e8e-4beb-a540-e5c9687810b8" Mar 16 00:11:43 crc kubenswrapper[4816]: I0316 00:11:43.061691 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7dc5f6d8dc-9cs6b" podStartSLOduration=42.061674202 podStartE2EDuration="42.061674202s" podCreationTimestamp="2026-03-16 00:11:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:11:43.041486147 +0000 UTC m=+296.137786100" watchObservedRunningTime="2026-03-16 00:11:43.061674202 +0000 UTC m=+296.157974155" Mar 16 00:11:43 crc kubenswrapper[4816]: I0316 00:11:43.094799 4816 patch_prober.go:28] interesting pod/controller-manager-7dc5f6d8dc-9cs6b container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.58:8443/healthz\": EOF" start-of-body= Mar 16 00:11:43 crc kubenswrapper[4816]: I0316 00:11:43.094856 4816 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-7dc5f6d8dc-9cs6b" podUID="a8a656b3-b809-44ee-bcc8-cfec2ae27ec7" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.58:8443/healthz\": EOF" Mar 16 00:11:43 crc kubenswrapper[4816]: I0316 00:11:43.672918 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93c061cd-ed29-4f9c-ad3a-0fb204ce6f8d" path="/var/lib/kubelet/pods/93c061cd-ed29-4f9c-ad3a-0fb204ce6f8d/volumes" Mar 16 00:11:44 crc kubenswrapper[4816]: I0316 00:11:44.040614 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-66c4db87dd-spzwx" 
event={"ID":"d843d76b-9317-42aa-848b-e3e11c3106cb","Type":"ContainerStarted","Data":"5e88284adb127d841afb88a27a9bd12da2e089c18ae8c1f363357186ff912634"} Mar 16 00:11:44 crc kubenswrapper[4816]: I0316 00:11:44.043350 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"2f10e41c-e6db-4083-bb86-ed0d39cc1a5a","Type":"ContainerStarted","Data":"29ea047cfa535477d88409add4c285e480d3dd9e6f79bea9d43c76200c1b38cb"} Mar 16 00:11:44 crc kubenswrapper[4816]: I0316 00:11:44.043843 4816 patch_prober.go:28] interesting pod/downloads-7954f5f757-5rr7c container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Mar 16 00:11:44 crc kubenswrapper[4816]: I0316 00:11:44.043911 4816 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-5rr7c" podUID="0ec3cdc0-f024-43cf-b520-7d2437e0f8df" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Mar 16 00:11:44 crc kubenswrapper[4816]: I0316 00:11:44.044192 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-5rr7c" Mar 16 00:11:44 crc kubenswrapper[4816]: I0316 00:11:44.747989 4816 patch_prober.go:28] interesting pod/downloads-7954f5f757-5rr7c container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Mar 16 00:11:44 crc kubenswrapper[4816]: I0316 00:11:44.748083 4816 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-5rr7c" podUID="0ec3cdc0-f024-43cf-b520-7d2437e0f8df" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": 
dial tcp 10.217.0.14:8080: connect: connection refused" Mar 16 00:11:44 crc kubenswrapper[4816]: I0316 00:11:44.747989 4816 patch_prober.go:28] interesting pod/downloads-7954f5f757-5rr7c container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Mar 16 00:11:44 crc kubenswrapper[4816]: I0316 00:11:44.748322 4816 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-5rr7c" podUID="0ec3cdc0-f024-43cf-b520-7d2437e0f8df" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Mar 16 00:11:44 crc kubenswrapper[4816]: E0316 00:11:44.835306 4816 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 16 00:11:44 crc kubenswrapper[4816]: E0316 00:11:44.835494 4816 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-p8frh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-bkxpc_openshift-marketplace(ff69863d-13e1-444c-ba61-6d68a509a203): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 16 00:11:44 crc kubenswrapper[4816]: E0316 00:11:44.837002 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-bkxpc" podUID="ff69863d-13e1-444c-ba61-6d68a509a203" Mar 16 00:11:45 crc 
kubenswrapper[4816]: I0316 00:11:45.050532 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"a9416716-e666-46d6-9d77-fe5c9702c035","Type":"ContainerStarted","Data":"dd1bd79082698289c067776233601bb17f6ba8cbb98dcb745b3329e5f4f6fb1f"} Mar 16 00:11:45 crc kubenswrapper[4816]: I0316 00:11:45.052684 4816 generic.go:334] "Generic (PLEG): container finished" podID="a8a656b3-b809-44ee-bcc8-cfec2ae27ec7" containerID="1622414dc19bed94547791fb46aea5e67b087d0109b950fdff872fc6af3fe300" exitCode=0 Mar 16 00:11:45 crc kubenswrapper[4816]: I0316 00:11:45.052793 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7dc5f6d8dc-9cs6b" event={"ID":"a8a656b3-b809-44ee-bcc8-cfec2ae27ec7","Type":"ContainerDied","Data":"1622414dc19bed94547791fb46aea5e67b087d0109b950fdff872fc6af3fe300"} Mar 16 00:11:45 crc kubenswrapper[4816]: I0316 00:11:45.053262 4816 patch_prober.go:28] interesting pod/downloads-7954f5f757-5rr7c container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Mar 16 00:11:45 crc kubenswrapper[4816]: I0316 00:11:45.053326 4816 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-5rr7c" podUID="0ec3cdc0-f024-43cf-b520-7d2437e0f8df" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Mar 16 00:11:45 crc kubenswrapper[4816]: E0316 00:11:45.054585 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-bkxpc" podUID="ff69863d-13e1-444c-ba61-6d68a509a203" Mar 16 00:11:45 
crc kubenswrapper[4816]: I0316 00:11:45.080731 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-66c4db87dd-spzwx" podStartSLOduration=23.080714836 podStartE2EDuration="23.080714836s" podCreationTimestamp="2026-03-16 00:11:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:11:45.079044941 +0000 UTC m=+298.175344924" watchObservedRunningTime="2026-03-16 00:11:45.080714836 +0000 UTC m=+298.177014789" Mar 16 00:11:45 crc kubenswrapper[4816]: I0316 00:11:45.147783 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=29.14776301 podStartE2EDuration="29.14776301s" podCreationTimestamp="2026-03-16 00:11:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:11:45.141470557 +0000 UTC m=+298.237770520" watchObservedRunningTime="2026-03-16 00:11:45.14776301 +0000 UTC m=+298.244062983" Mar 16 00:11:45 crc kubenswrapper[4816]: I0316 00:11:45.611162 4816 patch_prober.go:28] interesting pod/controller-manager-7dc5f6d8dc-9cs6b container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.58:8443/healthz\": dial tcp 10.217.0.58:8443: connect: connection refused" start-of-body= Mar 16 00:11:45 crc kubenswrapper[4816]: I0316 00:11:45.611232 4816 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-7dc5f6d8dc-9cs6b" podUID="a8a656b3-b809-44ee-bcc8-cfec2ae27ec7" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.58:8443/healthz\": dial tcp 10.217.0.58:8443: connect: connection refused" Mar 16 00:11:46 crc kubenswrapper[4816]: I0316 00:11:46.060212 4816 
generic.go:334] "Generic (PLEG): container finished" podID="2f10e41c-e6db-4083-bb86-ed0d39cc1a5a" containerID="29ea047cfa535477d88409add4c285e480d3dd9e6f79bea9d43c76200c1b38cb" exitCode=0 Mar 16 00:11:46 crc kubenswrapper[4816]: I0316 00:11:46.060308 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"2f10e41c-e6db-4083-bb86-ed0d39cc1a5a","Type":"ContainerDied","Data":"29ea047cfa535477d88409add4c285e480d3dd9e6f79bea9d43c76200c1b38cb"} Mar 16 00:11:46 crc kubenswrapper[4816]: I0316 00:11:46.107395 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=25.107377966 podStartE2EDuration="25.107377966s" podCreationTimestamp="2026-03-16 00:11:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:11:46.104713453 +0000 UTC m=+299.201013416" watchObservedRunningTime="2026-03-16 00:11:46.107377966 +0000 UTC m=+299.203677919" Mar 16 00:11:52 crc kubenswrapper[4816]: I0316 00:11:52.313720 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-66c4db87dd-spzwx" Mar 16 00:11:52 crc kubenswrapper[4816]: I0316 00:11:52.322434 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-66c4db87dd-spzwx" Mar 16 00:11:54 crc kubenswrapper[4816]: I0316 00:11:54.747984 4816 patch_prober.go:28] interesting pod/downloads-7954f5f757-5rr7c container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Mar 16 00:11:54 crc kubenswrapper[4816]: I0316 00:11:54.748809 4816 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-console/downloads-7954f5f757-5rr7c" podUID="0ec3cdc0-f024-43cf-b520-7d2437e0f8df" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Mar 16 00:11:54 crc kubenswrapper[4816]: I0316 00:11:54.748042 4816 patch_prober.go:28] interesting pod/downloads-7954f5f757-5rr7c container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Mar 16 00:11:54 crc kubenswrapper[4816]: I0316 00:11:54.748946 4816 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-5rr7c" podUID="0ec3cdc0-f024-43cf-b520-7d2437e0f8df" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Mar 16 00:11:56 crc kubenswrapper[4816]: I0316 00:11:56.610777 4816 patch_prober.go:28] interesting pod/controller-manager-7dc5f6d8dc-9cs6b container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.58:8443/healthz\": dial tcp 10.217.0.58:8443: i/o timeout" start-of-body= Mar 16 00:11:56 crc kubenswrapper[4816]: I0316 00:11:56.610880 4816 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-7dc5f6d8dc-9cs6b" podUID="a8a656b3-b809-44ee-bcc8-cfec2ae27ec7" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.58:8443/healthz\": dial tcp 10.217.0.58:8443: i/o timeout" Mar 16 00:12:00 crc kubenswrapper[4816]: I0316 00:12:00.133520 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29560332-wb8kg"] Mar 16 00:12:00 crc kubenswrapper[4816]: I0316 00:12:00.134889 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29560332-wb8kg" Mar 16 00:12:00 crc kubenswrapper[4816]: I0316 00:12:00.135217 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29560332-wb8kg"] Mar 16 00:12:00 crc kubenswrapper[4816]: I0316 00:12:00.138103 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 16 00:12:00 crc kubenswrapper[4816]: I0316 00:12:00.138582 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 16 00:12:00 crc kubenswrapper[4816]: I0316 00:12:00.142364 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-8hc2r" Mar 16 00:12:00 crc kubenswrapper[4816]: I0316 00:12:00.142858 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55wrt\" (UniqueName: \"kubernetes.io/projected/e570fb38-3e4c-4b9b-82d9-878ec6a5306f-kube-api-access-55wrt\") pod \"auto-csr-approver-29560332-wb8kg\" (UID: \"e570fb38-3e4c-4b9b-82d9-878ec6a5306f\") " pod="openshift-infra/auto-csr-approver-29560332-wb8kg" Mar 16 00:12:00 crc kubenswrapper[4816]: I0316 00:12:00.243301 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-55wrt\" (UniqueName: \"kubernetes.io/projected/e570fb38-3e4c-4b9b-82d9-878ec6a5306f-kube-api-access-55wrt\") pod \"auto-csr-approver-29560332-wb8kg\" (UID: \"e570fb38-3e4c-4b9b-82d9-878ec6a5306f\") " pod="openshift-infra/auto-csr-approver-29560332-wb8kg" Mar 16 00:12:00 crc kubenswrapper[4816]: I0316 00:12:00.261790 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-55wrt\" (UniqueName: \"kubernetes.io/projected/e570fb38-3e4c-4b9b-82d9-878ec6a5306f-kube-api-access-55wrt\") pod \"auto-csr-approver-29560332-wb8kg\" (UID: \"e570fb38-3e4c-4b9b-82d9-878ec6a5306f\") " 
pod="openshift-infra/auto-csr-approver-29560332-wb8kg" Mar 16 00:12:00 crc kubenswrapper[4816]: I0316 00:12:00.321322 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7dc5f6d8dc-9cs6b" Mar 16 00:12:00 crc kubenswrapper[4816]: I0316 00:12:00.362604 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7fd798dd88-7v9zs"] Mar 16 00:12:00 crc kubenswrapper[4816]: E0316 00:12:00.363288 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8a656b3-b809-44ee-bcc8-cfec2ae27ec7" containerName="controller-manager" Mar 16 00:12:00 crc kubenswrapper[4816]: I0316 00:12:00.363313 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8a656b3-b809-44ee-bcc8-cfec2ae27ec7" containerName="controller-manager" Mar 16 00:12:00 crc kubenswrapper[4816]: I0316 00:12:00.363622 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8a656b3-b809-44ee-bcc8-cfec2ae27ec7" containerName="controller-manager" Mar 16 00:12:00 crc kubenswrapper[4816]: I0316 00:12:00.366018 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7fd798dd88-7v9zs" Mar 16 00:12:00 crc kubenswrapper[4816]: I0316 00:12:00.369881 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7fd798dd88-7v9zs"] Mar 16 00:12:00 crc kubenswrapper[4816]: I0316 00:12:00.450662 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8a656b3-b809-44ee-bcc8-cfec2ae27ec7-config\") pod \"a8a656b3-b809-44ee-bcc8-cfec2ae27ec7\" (UID: \"a8a656b3-b809-44ee-bcc8-cfec2ae27ec7\") " Mar 16 00:12:00 crc kubenswrapper[4816]: I0316 00:12:00.450848 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a8a656b3-b809-44ee-bcc8-cfec2ae27ec7-client-ca\") pod \"a8a656b3-b809-44ee-bcc8-cfec2ae27ec7\" (UID: \"a8a656b3-b809-44ee-bcc8-cfec2ae27ec7\") " Mar 16 00:12:00 crc kubenswrapper[4816]: I0316 00:12:00.450927 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a8a656b3-b809-44ee-bcc8-cfec2ae27ec7-serving-cert\") pod \"a8a656b3-b809-44ee-bcc8-cfec2ae27ec7\" (UID: \"a8a656b3-b809-44ee-bcc8-cfec2ae27ec7\") " Mar 16 00:12:00 crc kubenswrapper[4816]: I0316 00:12:00.450986 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a8a656b3-b809-44ee-bcc8-cfec2ae27ec7-proxy-ca-bundles\") pod \"a8a656b3-b809-44ee-bcc8-cfec2ae27ec7\" (UID: \"a8a656b3-b809-44ee-bcc8-cfec2ae27ec7\") " Mar 16 00:12:00 crc kubenswrapper[4816]: I0316 00:12:00.451021 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hl55f\" (UniqueName: \"kubernetes.io/projected/a8a656b3-b809-44ee-bcc8-cfec2ae27ec7-kube-api-access-hl55f\") pod \"a8a656b3-b809-44ee-bcc8-cfec2ae27ec7\" (UID: 
\"a8a656b3-b809-44ee-bcc8-cfec2ae27ec7\") " Mar 16 00:12:00 crc kubenswrapper[4816]: I0316 00:12:00.451838 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a8a656b3-b809-44ee-bcc8-cfec2ae27ec7-config" (OuterVolumeSpecName: "config") pod "a8a656b3-b809-44ee-bcc8-cfec2ae27ec7" (UID: "a8a656b3-b809-44ee-bcc8-cfec2ae27ec7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:12:00 crc kubenswrapper[4816]: I0316 00:12:00.452229 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a8a656b3-b809-44ee-bcc8-cfec2ae27ec7-client-ca" (OuterVolumeSpecName: "client-ca") pod "a8a656b3-b809-44ee-bcc8-cfec2ae27ec7" (UID: "a8a656b3-b809-44ee-bcc8-cfec2ae27ec7"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:12:00 crc kubenswrapper[4816]: I0316 00:12:00.452249 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a8a656b3-b809-44ee-bcc8-cfec2ae27ec7-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "a8a656b3-b809-44ee-bcc8-cfec2ae27ec7" (UID: "a8a656b3-b809-44ee-bcc8-cfec2ae27ec7"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:12:00 crc kubenswrapper[4816]: I0316 00:12:00.456538 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560332-wb8kg" Mar 16 00:12:00 crc kubenswrapper[4816]: I0316 00:12:00.463927 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8a656b3-b809-44ee-bcc8-cfec2ae27ec7-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "a8a656b3-b809-44ee-bcc8-cfec2ae27ec7" (UID: "a8a656b3-b809-44ee-bcc8-cfec2ae27ec7"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:12:00 crc kubenswrapper[4816]: I0316 00:12:00.466786 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8a656b3-b809-44ee-bcc8-cfec2ae27ec7-kube-api-access-hl55f" (OuterVolumeSpecName: "kube-api-access-hl55f") pod "a8a656b3-b809-44ee-bcc8-cfec2ae27ec7" (UID: "a8a656b3-b809-44ee-bcc8-cfec2ae27ec7"). InnerVolumeSpecName "kube-api-access-hl55f". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:12:00 crc kubenswrapper[4816]: I0316 00:12:00.552914 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c682ba54-60b9-4293-ba42-dbde80524daf-proxy-ca-bundles\") pod \"controller-manager-7fd798dd88-7v9zs\" (UID: \"c682ba54-60b9-4293-ba42-dbde80524daf\") " pod="openshift-controller-manager/controller-manager-7fd798dd88-7v9zs" Mar 16 00:12:00 crc kubenswrapper[4816]: I0316 00:12:00.552986 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z8d5j\" (UniqueName: \"kubernetes.io/projected/c682ba54-60b9-4293-ba42-dbde80524daf-kube-api-access-z8d5j\") pod \"controller-manager-7fd798dd88-7v9zs\" (UID: \"c682ba54-60b9-4293-ba42-dbde80524daf\") " pod="openshift-controller-manager/controller-manager-7fd798dd88-7v9zs" Mar 16 00:12:00 crc kubenswrapper[4816]: I0316 00:12:00.553063 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c682ba54-60b9-4293-ba42-dbde80524daf-serving-cert\") pod \"controller-manager-7fd798dd88-7v9zs\" (UID: \"c682ba54-60b9-4293-ba42-dbde80524daf\") " pod="openshift-controller-manager/controller-manager-7fd798dd88-7v9zs" Mar 16 00:12:00 crc kubenswrapper[4816]: I0316 00:12:00.553110 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c682ba54-60b9-4293-ba42-dbde80524daf-config\") pod \"controller-manager-7fd798dd88-7v9zs\" (UID: \"c682ba54-60b9-4293-ba42-dbde80524daf\") " pod="openshift-controller-manager/controller-manager-7fd798dd88-7v9zs" Mar 16 00:12:00 crc kubenswrapper[4816]: I0316 00:12:00.553325 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c682ba54-60b9-4293-ba42-dbde80524daf-client-ca\") pod \"controller-manager-7fd798dd88-7v9zs\" (UID: \"c682ba54-60b9-4293-ba42-dbde80524daf\") " pod="openshift-controller-manager/controller-manager-7fd798dd88-7v9zs" Mar 16 00:12:00 crc kubenswrapper[4816]: I0316 00:12:00.553427 4816 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a8a656b3-b809-44ee-bcc8-cfec2ae27ec7-client-ca\") on node \"crc\" DevicePath \"\"" Mar 16 00:12:00 crc kubenswrapper[4816]: I0316 00:12:00.553447 4816 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a8a656b3-b809-44ee-bcc8-cfec2ae27ec7-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 16 00:12:00 crc kubenswrapper[4816]: I0316 00:12:00.553460 4816 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a8a656b3-b809-44ee-bcc8-cfec2ae27ec7-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 16 00:12:00 crc kubenswrapper[4816]: I0316 00:12:00.553474 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hl55f\" (UniqueName: \"kubernetes.io/projected/a8a656b3-b809-44ee-bcc8-cfec2ae27ec7-kube-api-access-hl55f\") on node \"crc\" DevicePath \"\"" Mar 16 00:12:00 crc kubenswrapper[4816]: I0316 00:12:00.553483 4816 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/a8a656b3-b809-44ee-bcc8-cfec2ae27ec7-config\") on node \"crc\" DevicePath \"\"" Mar 16 00:12:00 crc kubenswrapper[4816]: I0316 00:12:00.654493 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z8d5j\" (UniqueName: \"kubernetes.io/projected/c682ba54-60b9-4293-ba42-dbde80524daf-kube-api-access-z8d5j\") pod \"controller-manager-7fd798dd88-7v9zs\" (UID: \"c682ba54-60b9-4293-ba42-dbde80524daf\") " pod="openshift-controller-manager/controller-manager-7fd798dd88-7v9zs" Mar 16 00:12:00 crc kubenswrapper[4816]: I0316 00:12:00.654596 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c682ba54-60b9-4293-ba42-dbde80524daf-config\") pod \"controller-manager-7fd798dd88-7v9zs\" (UID: \"c682ba54-60b9-4293-ba42-dbde80524daf\") " pod="openshift-controller-manager/controller-manager-7fd798dd88-7v9zs" Mar 16 00:12:00 crc kubenswrapper[4816]: I0316 00:12:00.654642 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c682ba54-60b9-4293-ba42-dbde80524daf-serving-cert\") pod \"controller-manager-7fd798dd88-7v9zs\" (UID: \"c682ba54-60b9-4293-ba42-dbde80524daf\") " pod="openshift-controller-manager/controller-manager-7fd798dd88-7v9zs" Mar 16 00:12:00 crc kubenswrapper[4816]: I0316 00:12:00.654660 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c682ba54-60b9-4293-ba42-dbde80524daf-client-ca\") pod \"controller-manager-7fd798dd88-7v9zs\" (UID: \"c682ba54-60b9-4293-ba42-dbde80524daf\") " pod="openshift-controller-manager/controller-manager-7fd798dd88-7v9zs" Mar 16 00:12:00 crc kubenswrapper[4816]: I0316 00:12:00.654762 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/c682ba54-60b9-4293-ba42-dbde80524daf-proxy-ca-bundles\") pod \"controller-manager-7fd798dd88-7v9zs\" (UID: \"c682ba54-60b9-4293-ba42-dbde80524daf\") " pod="openshift-controller-manager/controller-manager-7fd798dd88-7v9zs" Mar 16 00:12:00 crc kubenswrapper[4816]: I0316 00:12:00.657259 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c682ba54-60b9-4293-ba42-dbde80524daf-client-ca\") pod \"controller-manager-7fd798dd88-7v9zs\" (UID: \"c682ba54-60b9-4293-ba42-dbde80524daf\") " pod="openshift-controller-manager/controller-manager-7fd798dd88-7v9zs" Mar 16 00:12:00 crc kubenswrapper[4816]: I0316 00:12:00.657309 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c682ba54-60b9-4293-ba42-dbde80524daf-config\") pod \"controller-manager-7fd798dd88-7v9zs\" (UID: \"c682ba54-60b9-4293-ba42-dbde80524daf\") " pod="openshift-controller-manager/controller-manager-7fd798dd88-7v9zs" Mar 16 00:12:00 crc kubenswrapper[4816]: I0316 00:12:00.657619 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c682ba54-60b9-4293-ba42-dbde80524daf-proxy-ca-bundles\") pod \"controller-manager-7fd798dd88-7v9zs\" (UID: \"c682ba54-60b9-4293-ba42-dbde80524daf\") " pod="openshift-controller-manager/controller-manager-7fd798dd88-7v9zs" Mar 16 00:12:00 crc kubenswrapper[4816]: I0316 00:12:00.659180 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c682ba54-60b9-4293-ba42-dbde80524daf-serving-cert\") pod \"controller-manager-7fd798dd88-7v9zs\" (UID: \"c682ba54-60b9-4293-ba42-dbde80524daf\") " pod="openshift-controller-manager/controller-manager-7fd798dd88-7v9zs" Mar 16 00:12:00 crc kubenswrapper[4816]: I0316 00:12:00.670001 4816 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-z8d5j\" (UniqueName: \"kubernetes.io/projected/c682ba54-60b9-4293-ba42-dbde80524daf-kube-api-access-z8d5j\") pod \"controller-manager-7fd798dd88-7v9zs\" (UID: \"c682ba54-60b9-4293-ba42-dbde80524daf\") " pod="openshift-controller-manager/controller-manager-7fd798dd88-7v9zs" Mar 16 00:12:00 crc kubenswrapper[4816]: I0316 00:12:00.688894 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7fd798dd88-7v9zs" Mar 16 00:12:01 crc kubenswrapper[4816]: I0316 00:12:01.163216 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7dc5f6d8dc-9cs6b" event={"ID":"a8a656b3-b809-44ee-bcc8-cfec2ae27ec7","Type":"ContainerDied","Data":"29366548a9919c43dc76196eb98a6cdaf85a57ba7ece5ccd1f7de91db89bc729"} Mar 16 00:12:01 crc kubenswrapper[4816]: I0316 00:12:01.163301 4816 scope.go:117] "RemoveContainer" containerID="1622414dc19bed94547791fb46aea5e67b087d0109b950fdff872fc6af3fe300" Mar 16 00:12:01 crc kubenswrapper[4816]: I0316 00:12:01.163332 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7dc5f6d8dc-9cs6b" Mar 16 00:12:01 crc kubenswrapper[4816]: I0316 00:12:01.199401 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7dc5f6d8dc-9cs6b"] Mar 16 00:12:01 crc kubenswrapper[4816]: I0316 00:12:01.202011 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-7dc5f6d8dc-9cs6b"] Mar 16 00:12:01 crc kubenswrapper[4816]: I0316 00:12:01.678899 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8a656b3-b809-44ee-bcc8-cfec2ae27ec7" path="/var/lib/kubelet/pods/a8a656b3-b809-44ee-bcc8-cfec2ae27ec7/volumes" Mar 16 00:12:01 crc kubenswrapper[4816]: I0316 00:12:01.862737 4816 patch_prober.go:28] interesting pod/machine-config-daemon-jrdcz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 16 00:12:01 crc kubenswrapper[4816]: I0316 00:12:01.862796 4816 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" podUID="dd08ece2-7636-4966-973a-e96a34b70b53" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 16 00:12:01 crc kubenswrapper[4816]: I0316 00:12:01.862841 4816 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" Mar 16 00:12:01 crc kubenswrapper[4816]: I0316 00:12:01.863456 4816 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7003a4592a48f87ab63e9f5637c7665040f9784a7c8f599c82ed61270ddb793c"} 
pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 16 00:12:01 crc kubenswrapper[4816]: I0316 00:12:01.863513 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" podUID="dd08ece2-7636-4966-973a-e96a34b70b53" containerName="machine-config-daemon" containerID="cri-o://7003a4592a48f87ab63e9f5637c7665040f9784a7c8f599c82ed61270ddb793c" gracePeriod=600 Mar 16 00:12:02 crc kubenswrapper[4816]: I0316 00:12:02.971415 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 16 00:12:03 crc kubenswrapper[4816]: I0316 00:12:03.112012 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2f10e41c-e6db-4083-bb86-ed0d39cc1a5a-kubelet-dir\") pod \"2f10e41c-e6db-4083-bb86-ed0d39cc1a5a\" (UID: \"2f10e41c-e6db-4083-bb86-ed0d39cc1a5a\") " Mar 16 00:12:03 crc kubenswrapper[4816]: I0316 00:12:03.112192 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2f10e41c-e6db-4083-bb86-ed0d39cc1a5a-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "2f10e41c-e6db-4083-bb86-ed0d39cc1a5a" (UID: "2f10e41c-e6db-4083-bb86-ed0d39cc1a5a"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:12:03 crc kubenswrapper[4816]: I0316 00:12:03.112305 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2f10e41c-e6db-4083-bb86-ed0d39cc1a5a-kube-api-access\") pod \"2f10e41c-e6db-4083-bb86-ed0d39cc1a5a\" (UID: \"2f10e41c-e6db-4083-bb86-ed0d39cc1a5a\") " Mar 16 00:12:03 crc kubenswrapper[4816]: I0316 00:12:03.113985 4816 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2f10e41c-e6db-4083-bb86-ed0d39cc1a5a-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 16 00:12:03 crc kubenswrapper[4816]: I0316 00:12:03.130037 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f10e41c-e6db-4083-bb86-ed0d39cc1a5a-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "2f10e41c-e6db-4083-bb86-ed0d39cc1a5a" (UID: "2f10e41c-e6db-4083-bb86-ed0d39cc1a5a"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:12:03 crc kubenswrapper[4816]: I0316 00:12:03.177766 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"2f10e41c-e6db-4083-bb86-ed0d39cc1a5a","Type":"ContainerDied","Data":"934833059dddc073b4415862e96aafb4ed1091c7c9cabca244d231fa20d34d92"} Mar 16 00:12:03 crc kubenswrapper[4816]: I0316 00:12:03.177807 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="934833059dddc073b4415862e96aafb4ed1091c7c9cabca244d231fa20d34d92" Mar 16 00:12:03 crc kubenswrapper[4816]: I0316 00:12:03.177815 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 16 00:12:03 crc kubenswrapper[4816]: I0316 00:12:03.180407 4816 generic.go:334] "Generic (PLEG): container finished" podID="dd08ece2-7636-4966-973a-e96a34b70b53" containerID="7003a4592a48f87ab63e9f5637c7665040f9784a7c8f599c82ed61270ddb793c" exitCode=0 Mar 16 00:12:03 crc kubenswrapper[4816]: I0316 00:12:03.180464 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" event={"ID":"dd08ece2-7636-4966-973a-e96a34b70b53","Type":"ContainerDied","Data":"7003a4592a48f87ab63e9f5637c7665040f9784a7c8f599c82ed61270ddb793c"} Mar 16 00:12:03 crc kubenswrapper[4816]: I0316 00:12:03.215539 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2f10e41c-e6db-4083-bb86-ed0d39cc1a5a-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 16 00:12:03 crc kubenswrapper[4816]: I0316 00:12:03.233774 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7fd798dd88-7v9zs"] Mar 16 00:12:03 crc kubenswrapper[4816]: I0316 00:12:03.482992 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29560332-wb8kg"] Mar 16 00:12:03 crc kubenswrapper[4816]: W0316 00:12:03.594311 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode570fb38_3e4c_4b9b_82d9_878ec6a5306f.slice/crio-8490369d24788dc7f985ef3f4779fdeb1e0c4853fea51ce2d69595047a963516 WatchSource:0}: Error finding container 8490369d24788dc7f985ef3f4779fdeb1e0c4853fea51ce2d69595047a963516: Status 404 returned error can't find the container with id 8490369d24788dc7f985ef3f4779fdeb1e0c4853fea51ce2d69595047a963516 Mar 16 00:12:04 crc kubenswrapper[4816]: I0316 00:12:04.206313 4816 generic.go:334] "Generic (PLEG): container finished" 
podID="bf586c6e-f957-46fc-8140-f9a9ea22510f" containerID="ac8cc1ccd360062b03be48af7d20fc5e22baa578d5f6d15342b7e9dcce308a09" exitCode=0 Mar 16 00:12:04 crc kubenswrapper[4816]: I0316 00:12:04.206392 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8q2xw" event={"ID":"bf586c6e-f957-46fc-8140-f9a9ea22510f","Type":"ContainerDied","Data":"ac8cc1ccd360062b03be48af7d20fc5e22baa578d5f6d15342b7e9dcce308a09"} Mar 16 00:12:04 crc kubenswrapper[4816]: I0316 00:12:04.210910 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7fd798dd88-7v9zs" event={"ID":"c682ba54-60b9-4293-ba42-dbde80524daf","Type":"ContainerStarted","Data":"7cefae33906fb5e888e4b14f5d736ee34b5be1bef6cc5d12987073a3493715d9"} Mar 16 00:12:04 crc kubenswrapper[4816]: I0316 00:12:04.210954 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7fd798dd88-7v9zs" event={"ID":"c682ba54-60b9-4293-ba42-dbde80524daf","Type":"ContainerStarted","Data":"3c1f3c1914f9bb0abf88eda5dfdf58f6ce50fa7199c9993ff27cf3aef4e09894"} Mar 16 00:12:04 crc kubenswrapper[4816]: I0316 00:12:04.211635 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7fd798dd88-7v9zs" Mar 16 00:12:04 crc kubenswrapper[4816]: I0316 00:12:04.217633 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560332-wb8kg" event={"ID":"e570fb38-3e4c-4b9b-82d9-878ec6a5306f","Type":"ContainerStarted","Data":"8490369d24788dc7f985ef3f4779fdeb1e0c4853fea51ce2d69595047a963516"} Mar 16 00:12:04 crc kubenswrapper[4816]: I0316 00:12:04.223348 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" 
event={"ID":"dd08ece2-7636-4966-973a-e96a34b70b53","Type":"ContainerStarted","Data":"8214b8a7550606e587b215ee7c72e3638e054dd083cb6fa7b37990d33bec509b"} Mar 16 00:12:04 crc kubenswrapper[4816]: I0316 00:12:04.234050 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7fd798dd88-7v9zs" Mar 16 00:12:04 crc kubenswrapper[4816]: I0316 00:12:04.274925 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7fd798dd88-7v9zs" podStartSLOduration=43.274903666 podStartE2EDuration="43.274903666s" podCreationTimestamp="2026-03-16 00:11:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:12:04.260856449 +0000 UTC m=+317.357156402" watchObservedRunningTime="2026-03-16 00:12:04.274903666 +0000 UTC m=+317.371203659" Mar 16 00:12:04 crc kubenswrapper[4816]: I0316 00:12:04.753357 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-5rr7c" Mar 16 00:12:05 crc kubenswrapper[4816]: I0316 00:12:05.235277 4816 generic.go:334] "Generic (PLEG): container finished" podID="ad80e1a9-75dc-4860-9bd9-d59b0c0ae43c" containerID="67c2ff8f6f6f445d4fd7c00ab7519136e896d1421d5929d43ac69cca289fa317" exitCode=0 Mar 16 00:12:05 crc kubenswrapper[4816]: I0316 00:12:05.235362 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4gwcw" event={"ID":"ad80e1a9-75dc-4860-9bd9-d59b0c0ae43c","Type":"ContainerDied","Data":"67c2ff8f6f6f445d4fd7c00ab7519136e896d1421d5929d43ac69cca289fa317"} Mar 16 00:12:05 crc kubenswrapper[4816]: I0316 00:12:05.238739 4816 generic.go:334] "Generic (PLEG): container finished" podID="ff69863d-13e1-444c-ba61-6d68a509a203" containerID="74ea61ccf0b15157a029e2724ad851af86bb050fe43292729cc5d2b513ea7141" exitCode=0 Mar 16 00:12:05 crc 
kubenswrapper[4816]: I0316 00:12:05.238813 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bkxpc" event={"ID":"ff69863d-13e1-444c-ba61-6d68a509a203","Type":"ContainerDied","Data":"74ea61ccf0b15157a029e2724ad851af86bb050fe43292729cc5d2b513ea7141"} Mar 16 00:12:05 crc kubenswrapper[4816]: I0316 00:12:05.242210 4816 generic.go:334] "Generic (PLEG): container finished" podID="b1b3efd0-cdc0-4973-8077-bcd1ea567bdd" containerID="c3b7cea0d43489debd23962525e8aaaf02e629e9ef59a654ba5aa278317285e1" exitCode=0 Mar 16 00:12:05 crc kubenswrapper[4816]: I0316 00:12:05.242281 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wh2h7" event={"ID":"b1b3efd0-cdc0-4973-8077-bcd1ea567bdd","Type":"ContainerDied","Data":"c3b7cea0d43489debd23962525e8aaaf02e629e9ef59a654ba5aa278317285e1"} Mar 16 00:12:05 crc kubenswrapper[4816]: I0316 00:12:05.246526 4816 generic.go:334] "Generic (PLEG): container finished" podID="bad7b5f7-88a8-4c20-a010-734a46f59e05" containerID="92ce11f74b2381302bcae2babd96b3eab76e1d28bfb034c70d8b99be8178dac1" exitCode=0 Mar 16 00:12:05 crc kubenswrapper[4816]: I0316 00:12:05.246596 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6gbkl" event={"ID":"bad7b5f7-88a8-4c20-a010-734a46f59e05","Type":"ContainerDied","Data":"92ce11f74b2381302bcae2babd96b3eab76e1d28bfb034c70d8b99be8178dac1"} Mar 16 00:12:05 crc kubenswrapper[4816]: I0316 00:12:05.256795 4816 generic.go:334] "Generic (PLEG): container finished" podID="cc1ea93d-1cf8-4145-ad35-83f2d1357f9d" containerID="12e67fc1baf84e28d7eb14a44704825a68bd0357e121983c70625a0778be907a" exitCode=0 Mar 16 00:12:05 crc kubenswrapper[4816]: I0316 00:12:05.256875 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hvpqn" 
event={"ID":"cc1ea93d-1cf8-4145-ad35-83f2d1357f9d","Type":"ContainerDied","Data":"12e67fc1baf84e28d7eb14a44704825a68bd0357e121983c70625a0778be907a"} Mar 16 00:12:05 crc kubenswrapper[4816]: I0316 00:12:05.266588 4816 generic.go:334] "Generic (PLEG): container finished" podID="6ca6c2c9-3a12-4eb3-9df1-7fdea640791d" containerID="43853e41b6150efe99d4e5270bddef069e33c9677ee5b6b76a29e50bb9d6dc72" exitCode=0 Mar 16 00:12:05 crc kubenswrapper[4816]: I0316 00:12:05.266846 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-52qs6" event={"ID":"6ca6c2c9-3a12-4eb3-9df1-7fdea640791d","Type":"ContainerDied","Data":"43853e41b6150efe99d4e5270bddef069e33c9677ee5b6b76a29e50bb9d6dc72"} Mar 16 00:12:05 crc kubenswrapper[4816]: I0316 00:12:05.272620 4816 generic.go:334] "Generic (PLEG): container finished" podID="a5ba22dd-8e8e-4beb-a540-e5c9687810b8" containerID="908485a9ff25d0805bbaf35b08443a41a047722c7887492c6b9436d8c8c3aabc" exitCode=0 Mar 16 00:12:05 crc kubenswrapper[4816]: I0316 00:12:05.272727 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7pb49" event={"ID":"a5ba22dd-8e8e-4beb-a540-e5c9687810b8","Type":"ContainerDied","Data":"908485a9ff25d0805bbaf35b08443a41a047722c7887492c6b9436d8c8c3aabc"} Mar 16 00:12:05 crc kubenswrapper[4816]: I0316 00:12:05.295105 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29560332-wb8kg" podStartSLOduration=4.072691934 podStartE2EDuration="5.295085317s" podCreationTimestamp="2026-03-16 00:12:00 +0000 UTC" firstStartedPulling="2026-03-16 00:12:03.706500921 +0000 UTC m=+316.802800874" lastFinishedPulling="2026-03-16 00:12:04.928894294 +0000 UTC m=+318.025194257" observedRunningTime="2026-03-16 00:12:05.29411605 +0000 UTC m=+318.390416003" watchObservedRunningTime="2026-03-16 00:12:05.295085317 +0000 UTC m=+318.391385270" Mar 16 00:12:06 crc kubenswrapper[4816]: I0316 00:12:06.283142 4816 
generic.go:334] "Generic (PLEG): container finished" podID="e570fb38-3e4c-4b9b-82d9-878ec6a5306f" containerID="c422afc027f6d729cf317777cce7cb5de5ed92334512743c933f67e04e4724ef" exitCode=0 Mar 16 00:12:06 crc kubenswrapper[4816]: I0316 00:12:06.283334 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560332-wb8kg" event={"ID":"e570fb38-3e4c-4b9b-82d9-878ec6a5306f","Type":"ContainerDied","Data":"c422afc027f6d729cf317777cce7cb5de5ed92334512743c933f67e04e4724ef"} Mar 16 00:12:06 crc kubenswrapper[4816]: I0316 00:12:06.286540 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8q2xw" event={"ID":"bf586c6e-f957-46fc-8140-f9a9ea22510f","Type":"ContainerStarted","Data":"050311929b94abcb5fed29f67a5c5b0ea4b9aaa7f08d32d15808bc1dd56bc7c0"} Mar 16 00:12:06 crc kubenswrapper[4816]: I0316 00:12:06.324650 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-8q2xw" podStartSLOduration=8.479037156 podStartE2EDuration="1m24.324621535s" podCreationTimestamp="2026-03-16 00:10:42 +0000 UTC" firstStartedPulling="2026-03-16 00:10:49.241831966 +0000 UTC m=+242.338131939" lastFinishedPulling="2026-03-16 00:12:05.087416365 +0000 UTC m=+318.183716318" observedRunningTime="2026-03-16 00:12:06.323016141 +0000 UTC m=+319.419316114" watchObservedRunningTime="2026-03-16 00:12:06.324621535 +0000 UTC m=+319.420921498" Mar 16 00:12:07 crc kubenswrapper[4816]: I0316 00:12:07.298026 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bkxpc" event={"ID":"ff69863d-13e1-444c-ba61-6d68a509a203","Type":"ContainerStarted","Data":"cf2475ba248dac35fe8355d380068957ad2cdb3c9fd96c04caf12282b0bbdb1f"} Mar 16 00:12:07 crc kubenswrapper[4816]: I0316 00:12:07.301531 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-52qs6" 
event={"ID":"6ca6c2c9-3a12-4eb3-9df1-7fdea640791d","Type":"ContainerStarted","Data":"f63537cc995d8e268cbd368b29b4cc5be951232e62b58e296828738dd881f0b2"} Mar 16 00:12:07 crc kubenswrapper[4816]: I0316 00:12:07.304837 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4gwcw" event={"ID":"ad80e1a9-75dc-4860-9bd9-d59b0c0ae43c","Type":"ContainerStarted","Data":"1907f6f84400f8d2fe767c5f795be3bd07851337cba8cf48da1973d87467affc"} Mar 16 00:12:07 crc kubenswrapper[4816]: I0316 00:12:07.328622 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-bkxpc" podStartSLOduration=8.211844836 podStartE2EDuration="1m25.328588499s" podCreationTimestamp="2026-03-16 00:10:42 +0000 UTC" firstStartedPulling="2026-03-16 00:10:49.242420622 +0000 UTC m=+242.338720585" lastFinishedPulling="2026-03-16 00:12:06.359164305 +0000 UTC m=+319.455464248" observedRunningTime="2026-03-16 00:12:07.327751616 +0000 UTC m=+320.424051589" watchObservedRunningTime="2026-03-16 00:12:07.328588499 +0000 UTC m=+320.424888452" Mar 16 00:12:07 crc kubenswrapper[4816]: I0316 00:12:07.349154 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-4gwcw" podStartSLOduration=3.546257089 podStartE2EDuration="1m25.349128634s" podCreationTimestamp="2026-03-16 00:10:42 +0000 UTC" firstStartedPulling="2026-03-16 00:10:44.443817187 +0000 UTC m=+237.540117140" lastFinishedPulling="2026-03-16 00:12:06.246688742 +0000 UTC m=+319.342988685" observedRunningTime="2026-03-16 00:12:07.348172958 +0000 UTC m=+320.444472931" watchObservedRunningTime="2026-03-16 00:12:07.349128634 +0000 UTC m=+320.445428587" Mar 16 00:12:07 crc kubenswrapper[4816]: I0316 00:12:07.373881 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-52qs6" podStartSLOduration=12.319315376 
podStartE2EDuration="1m22.373856555s" podCreationTimestamp="2026-03-16 00:10:45 +0000 UTC" firstStartedPulling="2026-03-16 00:10:56.720845664 +0000 UTC m=+249.817145627" lastFinishedPulling="2026-03-16 00:12:06.775386853 +0000 UTC m=+319.871686806" observedRunningTime="2026-03-16 00:12:07.37113234 +0000 UTC m=+320.467432293" watchObservedRunningTime="2026-03-16 00:12:07.373856555 +0000 UTC m=+320.470156508" Mar 16 00:12:07 crc kubenswrapper[4816]: I0316 00:12:07.657132 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560332-wb8kg" Mar 16 00:12:07 crc kubenswrapper[4816]: I0316 00:12:07.809153 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-55wrt\" (UniqueName: \"kubernetes.io/projected/e570fb38-3e4c-4b9b-82d9-878ec6a5306f-kube-api-access-55wrt\") pod \"e570fb38-3e4c-4b9b-82d9-878ec6a5306f\" (UID: \"e570fb38-3e4c-4b9b-82d9-878ec6a5306f\") " Mar 16 00:12:07 crc kubenswrapper[4816]: I0316 00:12:07.813840 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e570fb38-3e4c-4b9b-82d9-878ec6a5306f-kube-api-access-55wrt" (OuterVolumeSpecName: "kube-api-access-55wrt") pod "e570fb38-3e4c-4b9b-82d9-878ec6a5306f" (UID: "e570fb38-3e4c-4b9b-82d9-878ec6a5306f"). InnerVolumeSpecName "kube-api-access-55wrt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:12:07 crc kubenswrapper[4816]: I0316 00:12:07.911974 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-55wrt\" (UniqueName: \"kubernetes.io/projected/e570fb38-3e4c-4b9b-82d9-878ec6a5306f-kube-api-access-55wrt\") on node \"crc\" DevicePath \"\"" Mar 16 00:12:08 crc kubenswrapper[4816]: I0316 00:12:08.310266 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560332-wb8kg" event={"ID":"e570fb38-3e4c-4b9b-82d9-878ec6a5306f","Type":"ContainerDied","Data":"8490369d24788dc7f985ef3f4779fdeb1e0c4853fea51ce2d69595047a963516"} Mar 16 00:12:08 crc kubenswrapper[4816]: I0316 00:12:08.310317 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8490369d24788dc7f985ef3f4779fdeb1e0c4853fea51ce2d69595047a963516" Mar 16 00:12:08 crc kubenswrapper[4816]: I0316 00:12:08.310366 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560332-wb8kg" Mar 16 00:12:08 crc kubenswrapper[4816]: I0316 00:12:08.330955 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7pb49" event={"ID":"a5ba22dd-8e8e-4beb-a540-e5c9687810b8","Type":"ContainerStarted","Data":"625b9a78bb0b854582527b25363acb2b99ec915fd9386d255757cbfa80fd76bf"} Mar 16 00:12:08 crc kubenswrapper[4816]: I0316 00:12:08.333267 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wh2h7" event={"ID":"b1b3efd0-cdc0-4973-8077-bcd1ea567bdd","Type":"ContainerStarted","Data":"0362950976a76988474476b81bd7730cbe780ac154e5a2f2044e4e909795351d"} Mar 16 00:12:08 crc kubenswrapper[4816]: I0316 00:12:08.351773 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-7pb49" podStartSLOduration=11.679330523 podStartE2EDuration="1m24.351757713s" 
podCreationTimestamp="2026-03-16 00:10:44 +0000 UTC" firstStartedPulling="2026-03-16 00:10:54.620394536 +0000 UTC m=+247.716694489" lastFinishedPulling="2026-03-16 00:12:07.292821726 +0000 UTC m=+320.389121679" observedRunningTime="2026-03-16 00:12:08.347423603 +0000 UTC m=+321.443723556" watchObservedRunningTime="2026-03-16 00:12:08.351757713 +0000 UTC m=+321.448057666" Mar 16 00:12:08 crc kubenswrapper[4816]: I0316 00:12:08.364036 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-wh2h7" podStartSLOduration=8.055107686 podStartE2EDuration="1m26.36402418s" podCreationTimestamp="2026-03-16 00:10:42 +0000 UTC" firstStartedPulling="2026-03-16 00:10:49.24233088 +0000 UTC m=+242.338630843" lastFinishedPulling="2026-03-16 00:12:07.551247384 +0000 UTC m=+320.647547337" observedRunningTime="2026-03-16 00:12:08.361658145 +0000 UTC m=+321.457958098" watchObservedRunningTime="2026-03-16 00:12:08.36402418 +0000 UTC m=+321.460324123" Mar 16 00:12:11 crc kubenswrapper[4816]: I0316 00:12:11.361930 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6gbkl" event={"ID":"bad7b5f7-88a8-4c20-a010-734a46f59e05","Type":"ContainerStarted","Data":"9cbc70d2e0b275d40fbacb6be14712c60796f46bdd73e4f108a004a37c120cb9"} Mar 16 00:12:12 crc kubenswrapper[4816]: I0316 00:12:12.397584 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-6gbkl" podStartSLOduration=15.860012539 podStartE2EDuration="1m28.397531805s" podCreationTimestamp="2026-03-16 00:10:44 +0000 UTC" firstStartedPulling="2026-03-16 00:10:56.721284176 +0000 UTC m=+249.817584129" lastFinishedPulling="2026-03-16 00:12:09.258803442 +0000 UTC m=+322.355103395" observedRunningTime="2026-03-16 00:12:12.392344142 +0000 UTC m=+325.488644095" watchObservedRunningTime="2026-03-16 00:12:12.397531805 +0000 UTC m=+325.493831778" Mar 16 00:12:12 crc 
kubenswrapper[4816]: I0316 00:12:12.666374 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-4gwcw" Mar 16 00:12:12 crc kubenswrapper[4816]: I0316 00:12:12.666698 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-4gwcw" Mar 16 00:12:13 crc kubenswrapper[4816]: I0316 00:12:13.017450 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-wh2h7" Mar 16 00:12:13 crc kubenswrapper[4816]: I0316 00:12:13.017503 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-wh2h7" Mar 16 00:12:13 crc kubenswrapper[4816]: I0316 00:12:13.087437 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-bkxpc" Mar 16 00:12:13 crc kubenswrapper[4816]: I0316 00:12:13.088003 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-bkxpc" Mar 16 00:12:13 crc kubenswrapper[4816]: I0316 00:12:13.272459 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-8q2xw" Mar 16 00:12:13 crc kubenswrapper[4816]: I0316 00:12:13.272714 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-8q2xw" Mar 16 00:12:13 crc kubenswrapper[4816]: I0316 00:12:13.378142 4816 generic.go:334] "Generic (PLEG): container finished" podID="9fc59286-0388-4519-afc7-f2c8cf80ab40" containerID="a30805e487fac9e751dab1510445d1b512d8b7784f8e73df1f67f72887178e24" exitCode=0 Mar 16 00:12:13 crc kubenswrapper[4816]: I0316 00:12:13.378219 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29560320-s9q72" 
event={"ID":"9fc59286-0388-4519-afc7-f2c8cf80ab40","Type":"ContainerDied","Data":"a30805e487fac9e751dab1510445d1b512d8b7784f8e73df1f67f72887178e24"} Mar 16 00:12:13 crc kubenswrapper[4816]: I0316 00:12:13.380930 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hvpqn" event={"ID":"cc1ea93d-1cf8-4145-ad35-83f2d1357f9d","Type":"ContainerStarted","Data":"8f95ead769819114b5324ad74b013a299738b066a23d9b7aab0526d5b3f15f3a"} Mar 16 00:12:13 crc kubenswrapper[4816]: I0316 00:12:13.426893 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-hvpqn" podStartSLOduration=11.489613704 podStartE2EDuration="1m27.426870591s" podCreationTimestamp="2026-03-16 00:10:46 +0000 UTC" firstStartedPulling="2026-03-16 00:10:56.668778093 +0000 UTC m=+249.765078046" lastFinishedPulling="2026-03-16 00:12:12.60603498 +0000 UTC m=+325.702334933" observedRunningTime="2026-03-16 00:12:13.423294753 +0000 UTC m=+326.519594706" watchObservedRunningTime="2026-03-16 00:12:13.426870591 +0000 UTC m=+326.523170544" Mar 16 00:12:13 crc kubenswrapper[4816]: I0316 00:12:13.429797 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-8q2xw" Mar 16 00:12:13 crc kubenswrapper[4816]: I0316 00:12:13.430608 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-bkxpc" Mar 16 00:12:13 crc kubenswrapper[4816]: I0316 00:12:13.430966 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-wh2h7" Mar 16 00:12:13 crc kubenswrapper[4816]: I0316 00:12:13.434330 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-4gwcw" Mar 16 00:12:13 crc kubenswrapper[4816]: I0316 00:12:13.482835 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-marketplace/certified-operators-wh2h7" Mar 16 00:12:13 crc kubenswrapper[4816]: I0316 00:12:13.492786 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-8q2xw" Mar 16 00:12:13 crc kubenswrapper[4816]: I0316 00:12:13.493716 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-4gwcw" Mar 16 00:12:14 crc kubenswrapper[4816]: I0316 00:12:14.431923 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-bkxpc" Mar 16 00:12:14 crc kubenswrapper[4816]: I0316 00:12:14.655236 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-7pb49" Mar 16 00:12:14 crc kubenswrapper[4816]: I0316 00:12:14.655291 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-7pb49" Mar 16 00:12:14 crc kubenswrapper[4816]: I0316 00:12:14.694474 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-pruner-29560320-s9q72" Mar 16 00:12:14 crc kubenswrapper[4816]: I0316 00:12:14.701426 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-7pb49" Mar 16 00:12:14 crc kubenswrapper[4816]: I0316 00:12:14.816087 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/9fc59286-0388-4519-afc7-f2c8cf80ab40-serviceca\") pod \"9fc59286-0388-4519-afc7-f2c8cf80ab40\" (UID: \"9fc59286-0388-4519-afc7-f2c8cf80ab40\") " Mar 16 00:12:14 crc kubenswrapper[4816]: I0316 00:12:14.816217 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2tj47\" (UniqueName: \"kubernetes.io/projected/9fc59286-0388-4519-afc7-f2c8cf80ab40-kube-api-access-2tj47\") pod \"9fc59286-0388-4519-afc7-f2c8cf80ab40\" (UID: \"9fc59286-0388-4519-afc7-f2c8cf80ab40\") " Mar 16 00:12:14 crc kubenswrapper[4816]: I0316 00:12:14.817928 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9fc59286-0388-4519-afc7-f2c8cf80ab40-serviceca" (OuterVolumeSpecName: "serviceca") pod "9fc59286-0388-4519-afc7-f2c8cf80ab40" (UID: "9fc59286-0388-4519-afc7-f2c8cf80ab40"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:12:14 crc kubenswrapper[4816]: I0316 00:12:14.822387 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9fc59286-0388-4519-afc7-f2c8cf80ab40-kube-api-access-2tj47" (OuterVolumeSpecName: "kube-api-access-2tj47") pod "9fc59286-0388-4519-afc7-f2c8cf80ab40" (UID: "9fc59286-0388-4519-afc7-f2c8cf80ab40"). InnerVolumeSpecName "kube-api-access-2tj47". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:12:14 crc kubenswrapper[4816]: I0316 00:12:14.917656 4816 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/9fc59286-0388-4519-afc7-f2c8cf80ab40-serviceca\") on node \"crc\" DevicePath \"\"" Mar 16 00:12:14 crc kubenswrapper[4816]: I0316 00:12:14.917691 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2tj47\" (UniqueName: \"kubernetes.io/projected/9fc59286-0388-4519-afc7-f2c8cf80ab40-kube-api-access-2tj47\") on node \"crc\" DevicePath \"\"" Mar 16 00:12:15 crc kubenswrapper[4816]: I0316 00:12:15.098603 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-6gbkl" Mar 16 00:12:15 crc kubenswrapper[4816]: I0316 00:12:15.099541 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-6gbkl" Mar 16 00:12:15 crc kubenswrapper[4816]: I0316 00:12:15.148130 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-6gbkl" Mar 16 00:12:15 crc kubenswrapper[4816]: I0316 00:12:15.396098 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-pruner-29560320-s9q72" Mar 16 00:12:15 crc kubenswrapper[4816]: I0316 00:12:15.396305 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29560320-s9q72" event={"ID":"9fc59286-0388-4519-afc7-f2c8cf80ab40","Type":"ContainerDied","Data":"469ef439f1bc4e49165115c6fecd0f6feec675c1f680294bca4301ee3520daee"} Mar 16 00:12:15 crc kubenswrapper[4816]: I0316 00:12:15.397737 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="469ef439f1bc4e49165115c6fecd0f6feec675c1f680294bca4301ee3520daee" Mar 16 00:12:15 crc kubenswrapper[4816]: I0316 00:12:15.452059 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-7pb49" Mar 16 00:12:15 crc kubenswrapper[4816]: I0316 00:12:15.455358 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-6gbkl" Mar 16 00:12:16 crc kubenswrapper[4816]: I0316 00:12:16.084594 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-52qs6" Mar 16 00:12:16 crc kubenswrapper[4816]: I0316 00:12:16.085722 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-52qs6" Mar 16 00:12:16 crc kubenswrapper[4816]: I0316 00:12:16.134806 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-52qs6" Mar 16 00:12:16 crc kubenswrapper[4816]: I0316 00:12:16.461923 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-52qs6" Mar 16 00:12:16 crc kubenswrapper[4816]: I0316 00:12:16.472818 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-hvpqn" Mar 16 00:12:16 crc kubenswrapper[4816]: I0316 
00:12:16.473191 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-hvpqn" Mar 16 00:12:17 crc kubenswrapper[4816]: I0316 00:12:17.473454 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bkxpc"] Mar 16 00:12:17 crc kubenswrapper[4816]: I0316 00:12:17.474222 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-bkxpc" podUID="ff69863d-13e1-444c-ba61-6d68a509a203" containerName="registry-server" containerID="cri-o://cf2475ba248dac35fe8355d380068957ad2cdb3c9fd96c04caf12282b0bbdb1f" gracePeriod=2 Mar 16 00:12:17 crc kubenswrapper[4816]: I0316 00:12:17.517704 4816 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-hvpqn" podUID="cc1ea93d-1cf8-4145-ad35-83f2d1357f9d" containerName="registry-server" probeResult="failure" output=< Mar 16 00:12:17 crc kubenswrapper[4816]: timeout: failed to connect service ":50051" within 1s Mar 16 00:12:17 crc kubenswrapper[4816]: > Mar 16 00:12:17 crc kubenswrapper[4816]: I0316 00:12:17.676058 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8q2xw"] Mar 16 00:12:17 crc kubenswrapper[4816]: I0316 00:12:17.676379 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-8q2xw" podUID="bf586c6e-f957-46fc-8140-f9a9ea22510f" containerName="registry-server" containerID="cri-o://050311929b94abcb5fed29f67a5c5b0ea4b9aaa7f08d32d15808bc1dd56bc7c0" gracePeriod=2 Mar 16 00:12:18 crc kubenswrapper[4816]: I0316 00:12:18.439381 4816 generic.go:334] "Generic (PLEG): container finished" podID="ff69863d-13e1-444c-ba61-6d68a509a203" containerID="cf2475ba248dac35fe8355d380068957ad2cdb3c9fd96c04caf12282b0bbdb1f" exitCode=0 Mar 16 00:12:18 crc kubenswrapper[4816]: I0316 00:12:18.439599 4816 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/community-operators-bkxpc" event={"ID":"ff69863d-13e1-444c-ba61-6d68a509a203","Type":"ContainerDied","Data":"cf2475ba248dac35fe8355d380068957ad2cdb3c9fd96c04caf12282b0bbdb1f"} Mar 16 00:12:18 crc kubenswrapper[4816]: I0316 00:12:18.719833 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:12:18 crc kubenswrapper[4816]: I0316 00:12:18.720523 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:12:18 crc kubenswrapper[4816]: I0316 00:12:18.720633 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:12:18 crc kubenswrapper[4816]: I0316 00:12:18.720727 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:12:18 crc 
kubenswrapper[4816]: I0316 00:12:18.722339 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 16 00:12:18 crc kubenswrapper[4816]: I0316 00:12:18.722635 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 16 00:12:18 crc kubenswrapper[4816]: I0316 00:12:18.722872 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 16 00:12:18 crc kubenswrapper[4816]: I0316 00:12:18.732793 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 16 00:12:18 crc kubenswrapper[4816]: I0316 00:12:18.738144 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:12:18 crc kubenswrapper[4816]: I0316 00:12:18.744860 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:12:18 crc kubenswrapper[4816]: I0316 00:12:18.745642 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:12:18 crc kubenswrapper[4816]: I0316 00:12:18.899160 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:12:18 crc kubenswrapper[4816]: I0316 00:12:18.984996 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:12:18 crc kubenswrapper[4816]: I0316 00:12:18.994616 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:12:19 crc kubenswrapper[4816]: I0316 00:12:19.002343 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:12:19 crc kubenswrapper[4816]: W0316 00:12:19.447562 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-c40738c61b48d6c4c4636768df0e4d99df2211903494181e2eaab3802f9cdd34 WatchSource:0}: Error finding container c40738c61b48d6c4c4636768df0e4d99df2211903494181e2eaab3802f9cdd34: Status 404 returned error can't find the container with id c40738c61b48d6c4c4636768df0e4d99df2211903494181e2eaab3802f9cdd34 Mar 16 00:12:19 crc kubenswrapper[4816]: I0316 00:12:19.454935 4816 generic.go:334] "Generic (PLEG): container finished" podID="bf586c6e-f957-46fc-8140-f9a9ea22510f" containerID="050311929b94abcb5fed29f67a5c5b0ea4b9aaa7f08d32d15808bc1dd56bc7c0" exitCode=0 Mar 16 00:12:19 crc kubenswrapper[4816]: I0316 00:12:19.454981 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8q2xw" event={"ID":"bf586c6e-f957-46fc-8140-f9a9ea22510f","Type":"ContainerDied","Data":"050311929b94abcb5fed29f67a5c5b0ea4b9aaa7f08d32d15808bc1dd56bc7c0"} Mar 16 00:12:19 crc kubenswrapper[4816]: W0316 00:12:19.474319 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-4ebbcb4fc0112955f2090e810b55796a7e29fd750b6da3ad533b188be904fb4c WatchSource:0}: Error finding container 4ebbcb4fc0112955f2090e810b55796a7e29fd750b6da3ad533b188be904fb4c: Status 404 returned error can't find the container with id 4ebbcb4fc0112955f2090e810b55796a7e29fd750b6da3ad533b188be904fb4c Mar 16 00:12:19 crc kubenswrapper[4816]: I0316 00:12:19.580034 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bkxpc" Mar 16 00:12:19 crc kubenswrapper[4816]: W0316 00:12:19.682736 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-d6e80ac2881be293aae0d1cf60a7526311b9ca54768daed4b468ce6957f38c33 WatchSource:0}: Error finding container d6e80ac2881be293aae0d1cf60a7526311b9ca54768daed4b468ce6957f38c33: Status 404 returned error can't find the container with id d6e80ac2881be293aae0d1cf60a7526311b9ca54768daed4b468ce6957f38c33 Mar 16 00:12:19 crc kubenswrapper[4816]: I0316 00:12:19.734166 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff69863d-13e1-444c-ba61-6d68a509a203-catalog-content\") pod \"ff69863d-13e1-444c-ba61-6d68a509a203\" (UID: \"ff69863d-13e1-444c-ba61-6d68a509a203\") " Mar 16 00:12:19 crc kubenswrapper[4816]: I0316 00:12:19.734254 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff69863d-13e1-444c-ba61-6d68a509a203-utilities\") pod \"ff69863d-13e1-444c-ba61-6d68a509a203\" (UID: \"ff69863d-13e1-444c-ba61-6d68a509a203\") " Mar 16 00:12:19 crc kubenswrapper[4816]: I0316 00:12:19.734304 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p8frh\" (UniqueName: \"kubernetes.io/projected/ff69863d-13e1-444c-ba61-6d68a509a203-kube-api-access-p8frh\") pod \"ff69863d-13e1-444c-ba61-6d68a509a203\" (UID: \"ff69863d-13e1-444c-ba61-6d68a509a203\") " Mar 16 00:12:19 crc kubenswrapper[4816]: I0316 00:12:19.735813 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff69863d-13e1-444c-ba61-6d68a509a203-utilities" (OuterVolumeSpecName: "utilities") pod "ff69863d-13e1-444c-ba61-6d68a509a203" (UID: 
"ff69863d-13e1-444c-ba61-6d68a509a203"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:12:19 crc kubenswrapper[4816]: I0316 00:12:19.739810 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff69863d-13e1-444c-ba61-6d68a509a203-kube-api-access-p8frh" (OuterVolumeSpecName: "kube-api-access-p8frh") pod "ff69863d-13e1-444c-ba61-6d68a509a203" (UID: "ff69863d-13e1-444c-ba61-6d68a509a203"). InnerVolumeSpecName "kube-api-access-p8frh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:12:19 crc kubenswrapper[4816]: I0316 00:12:19.836740 4816 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff69863d-13e1-444c-ba61-6d68a509a203-utilities\") on node \"crc\" DevicePath \"\"" Mar 16 00:12:19 crc kubenswrapper[4816]: I0316 00:12:19.839063 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p8frh\" (UniqueName: \"kubernetes.io/projected/ff69863d-13e1-444c-ba61-6d68a509a203-kube-api-access-p8frh\") on node \"crc\" DevicePath \"\"" Mar 16 00:12:19 crc kubenswrapper[4816]: I0316 00:12:19.874781 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6gbkl"] Mar 16 00:12:19 crc kubenswrapper[4816]: I0316 00:12:19.875058 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-6gbkl" podUID="bad7b5f7-88a8-4c20-a010-734a46f59e05" containerName="registry-server" containerID="cri-o://9cbc70d2e0b275d40fbacb6be14712c60796f46bdd73e4f108a004a37c120cb9" gracePeriod=2 Mar 16 00:12:20 crc kubenswrapper[4816]: I0316 00:12:20.214433 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8q2xw" Mar 16 00:12:20 crc kubenswrapper[4816]: I0316 00:12:20.346797 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf586c6e-f957-46fc-8140-f9a9ea22510f-utilities\") pod \"bf586c6e-f957-46fc-8140-f9a9ea22510f\" (UID: \"bf586c6e-f957-46fc-8140-f9a9ea22510f\") " Mar 16 00:12:20 crc kubenswrapper[4816]: I0316 00:12:20.347491 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8gwq2\" (UniqueName: \"kubernetes.io/projected/bf586c6e-f957-46fc-8140-f9a9ea22510f-kube-api-access-8gwq2\") pod \"bf586c6e-f957-46fc-8140-f9a9ea22510f\" (UID: \"bf586c6e-f957-46fc-8140-f9a9ea22510f\") " Mar 16 00:12:20 crc kubenswrapper[4816]: I0316 00:12:20.347593 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf586c6e-f957-46fc-8140-f9a9ea22510f-catalog-content\") pod \"bf586c6e-f957-46fc-8140-f9a9ea22510f\" (UID: \"bf586c6e-f957-46fc-8140-f9a9ea22510f\") " Mar 16 00:12:20 crc kubenswrapper[4816]: I0316 00:12:20.347827 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf586c6e-f957-46fc-8140-f9a9ea22510f-utilities" (OuterVolumeSpecName: "utilities") pod "bf586c6e-f957-46fc-8140-f9a9ea22510f" (UID: "bf586c6e-f957-46fc-8140-f9a9ea22510f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:12:20 crc kubenswrapper[4816]: I0316 00:12:20.348122 4816 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf586c6e-f957-46fc-8140-f9a9ea22510f-utilities\") on node \"crc\" DevicePath \"\"" Mar 16 00:12:20 crc kubenswrapper[4816]: I0316 00:12:20.353149 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf586c6e-f957-46fc-8140-f9a9ea22510f-kube-api-access-8gwq2" (OuterVolumeSpecName: "kube-api-access-8gwq2") pod "bf586c6e-f957-46fc-8140-f9a9ea22510f" (UID: "bf586c6e-f957-46fc-8140-f9a9ea22510f"). InnerVolumeSpecName "kube-api-access-8gwq2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:12:20 crc kubenswrapper[4816]: I0316 00:12:20.448859 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8gwq2\" (UniqueName: \"kubernetes.io/projected/bf586c6e-f957-46fc-8140-f9a9ea22510f-kube-api-access-8gwq2\") on node \"crc\" DevicePath \"\"" Mar 16 00:12:20 crc kubenswrapper[4816]: I0316 00:12:20.462648 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8q2xw" event={"ID":"bf586c6e-f957-46fc-8140-f9a9ea22510f","Type":"ContainerDied","Data":"e4ae76a3c7fcca7fb114ac9afc90c35f9554a0225d9ad44974098c92d5909906"} Mar 16 00:12:20 crc kubenswrapper[4816]: I0316 00:12:20.462728 4816 scope.go:117] "RemoveContainer" containerID="050311929b94abcb5fed29f67a5c5b0ea4b9aaa7f08d32d15808bc1dd56bc7c0" Mar 16 00:12:20 crc kubenswrapper[4816]: I0316 00:12:20.462862 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8q2xw" Mar 16 00:12:20 crc kubenswrapper[4816]: I0316 00:12:20.465208 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"c40738c61b48d6c4c4636768df0e4d99df2211903494181e2eaab3802f9cdd34"} Mar 16 00:12:20 crc kubenswrapper[4816]: I0316 00:12:20.467701 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"d6e80ac2881be293aae0d1cf60a7526311b9ca54768daed4b468ce6957f38c33"} Mar 16 00:12:20 crc kubenswrapper[4816]: I0316 00:12:20.470729 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bkxpc" event={"ID":"ff69863d-13e1-444c-ba61-6d68a509a203","Type":"ContainerDied","Data":"bb49e18eedefb469d6109a46587df4a9ef4eb1e0a35954df9209a551fbb7b5b4"} Mar 16 00:12:20 crc kubenswrapper[4816]: I0316 00:12:20.470848 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bkxpc" Mar 16 00:12:20 crc kubenswrapper[4816]: I0316 00:12:20.479176 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"4ebbcb4fc0112955f2090e810b55796a7e29fd750b6da3ad533b188be904fb4c"} Mar 16 00:12:20 crc kubenswrapper[4816]: I0316 00:12:20.482055 4816 scope.go:117] "RemoveContainer" containerID="ac8cc1ccd360062b03be48af7d20fc5e22baa578d5f6d15342b7e9dcce308a09" Mar 16 00:12:20 crc kubenswrapper[4816]: I0316 00:12:20.510881 4816 scope.go:117] "RemoveContainer" containerID="307037ba13f42f68192fcc6d4406e472de7d9aac5f7546be49cd42537db26240" Mar 16 00:12:20 crc kubenswrapper[4816]: I0316 00:12:20.540785 4816 scope.go:117] "RemoveContainer" containerID="cf2475ba248dac35fe8355d380068957ad2cdb3c9fd96c04caf12282b0bbdb1f" Mar 16 00:12:20 crc kubenswrapper[4816]: I0316 00:12:20.552786 4816 scope.go:117] "RemoveContainer" containerID="74ea61ccf0b15157a029e2724ad851af86bb050fe43292729cc5d2b513ea7141" Mar 16 00:12:20 crc kubenswrapper[4816]: I0316 00:12:20.564417 4816 scope.go:117] "RemoveContainer" containerID="2d7e1ead92ce8010c6084321e28f13a3b17186a0141a9a086a18947183a41d47" Mar 16 00:12:20 crc kubenswrapper[4816]: I0316 00:12:20.765815 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf586c6e-f957-46fc-8140-f9a9ea22510f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bf586c6e-f957-46fc-8140-f9a9ea22510f" (UID: "bf586c6e-f957-46fc-8140-f9a9ea22510f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:12:20 crc kubenswrapper[4816]: I0316 00:12:20.856335 4816 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf586c6e-f957-46fc-8140-f9a9ea22510f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 16 00:12:21 crc kubenswrapper[4816]: I0316 00:12:21.091999 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8q2xw"] Mar 16 00:12:21 crc kubenswrapper[4816]: I0316 00:12:21.104081 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-8q2xw"] Mar 16 00:12:21 crc kubenswrapper[4816]: I0316 00:12:21.318942 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff69863d-13e1-444c-ba61-6d68a509a203-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ff69863d-13e1-444c-ba61-6d68a509a203" (UID: "ff69863d-13e1-444c-ba61-6d68a509a203"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:12:21 crc kubenswrapper[4816]: I0316 00:12:21.361863 4816 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff69863d-13e1-444c-ba61-6d68a509a203-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 16 00:12:21 crc kubenswrapper[4816]: I0316 00:12:21.413055 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bkxpc"] Mar 16 00:12:21 crc kubenswrapper[4816]: I0316 00:12:21.413118 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-bkxpc"] Mar 16 00:12:21 crc kubenswrapper[4816]: I0316 00:12:21.485440 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"3ea1184f822884ab69fdfb7a5fe5136882ffc21bf737db72c6911e5ee012e8d7"} Mar 16 00:12:21 crc kubenswrapper[4816]: I0316 00:12:21.487628 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"dc39a4751822c7014b25d13324976e8315a93bbe519111882b17f3808d80cdcc"} Mar 16 00:12:21 crc kubenswrapper[4816]: I0316 00:12:21.499235 4816 generic.go:334] "Generic (PLEG): container finished" podID="bad7b5f7-88a8-4c20-a010-734a46f59e05" containerID="9cbc70d2e0b275d40fbacb6be14712c60796f46bdd73e4f108a004a37c120cb9" exitCode=0 Mar 16 00:12:21 crc kubenswrapper[4816]: I0316 00:12:21.499309 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6gbkl" event={"ID":"bad7b5f7-88a8-4c20-a010-734a46f59e05","Type":"ContainerDied","Data":"9cbc70d2e0b275d40fbacb6be14712c60796f46bdd73e4f108a004a37c120cb9"} Mar 16 00:12:21 crc kubenswrapper[4816]: I0316 00:12:21.499337 
4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6gbkl" event={"ID":"bad7b5f7-88a8-4c20-a010-734a46f59e05","Type":"ContainerDied","Data":"dfccefcb0f8e6864404f0a8715036becb9b7ec4a3aef59dca2da5e935bde36d5"} Mar 16 00:12:21 crc kubenswrapper[4816]: I0316 00:12:21.499350 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dfccefcb0f8e6864404f0a8715036becb9b7ec4a3aef59dca2da5e935bde36d5" Mar 16 00:12:21 crc kubenswrapper[4816]: I0316 00:12:21.500816 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"554f8200c740d02a2297f006a6411dc8ec1040cc94013d3eb914e9f1af3bbcc7"} Mar 16 00:12:21 crc kubenswrapper[4816]: I0316 00:12:21.501419 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:12:21 crc kubenswrapper[4816]: I0316 00:12:21.542837 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6gbkl" Mar 16 00:12:21 crc kubenswrapper[4816]: I0316 00:12:21.664294 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l2d9d\" (UniqueName: \"kubernetes.io/projected/bad7b5f7-88a8-4c20-a010-734a46f59e05-kube-api-access-l2d9d\") pod \"bad7b5f7-88a8-4c20-a010-734a46f59e05\" (UID: \"bad7b5f7-88a8-4c20-a010-734a46f59e05\") " Mar 16 00:12:21 crc kubenswrapper[4816]: I0316 00:12:21.664433 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bad7b5f7-88a8-4c20-a010-734a46f59e05-catalog-content\") pod \"bad7b5f7-88a8-4c20-a010-734a46f59e05\" (UID: \"bad7b5f7-88a8-4c20-a010-734a46f59e05\") " Mar 16 00:12:21 crc kubenswrapper[4816]: I0316 00:12:21.664490 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bad7b5f7-88a8-4c20-a010-734a46f59e05-utilities\") pod \"bad7b5f7-88a8-4c20-a010-734a46f59e05\" (UID: \"bad7b5f7-88a8-4c20-a010-734a46f59e05\") " Mar 16 00:12:21 crc kubenswrapper[4816]: I0316 00:12:21.674335 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bad7b5f7-88a8-4c20-a010-734a46f59e05-utilities" (OuterVolumeSpecName: "utilities") pod "bad7b5f7-88a8-4c20-a010-734a46f59e05" (UID: "bad7b5f7-88a8-4c20-a010-734a46f59e05"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:12:21 crc kubenswrapper[4816]: I0316 00:12:21.674666 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bad7b5f7-88a8-4c20-a010-734a46f59e05-kube-api-access-l2d9d" (OuterVolumeSpecName: "kube-api-access-l2d9d") pod "bad7b5f7-88a8-4c20-a010-734a46f59e05" (UID: "bad7b5f7-88a8-4c20-a010-734a46f59e05"). InnerVolumeSpecName "kube-api-access-l2d9d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:12:21 crc kubenswrapper[4816]: I0316 00:12:21.686204 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf586c6e-f957-46fc-8140-f9a9ea22510f" path="/var/lib/kubelet/pods/bf586c6e-f957-46fc-8140-f9a9ea22510f/volumes" Mar 16 00:12:21 crc kubenswrapper[4816]: I0316 00:12:21.687113 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff69863d-13e1-444c-ba61-6d68a509a203" path="/var/lib/kubelet/pods/ff69863d-13e1-444c-ba61-6d68a509a203/volumes" Mar 16 00:12:21 crc kubenswrapper[4816]: I0316 00:12:21.712617 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bad7b5f7-88a8-4c20-a010-734a46f59e05-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bad7b5f7-88a8-4c20-a010-734a46f59e05" (UID: "bad7b5f7-88a8-4c20-a010-734a46f59e05"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:12:21 crc kubenswrapper[4816]: I0316 00:12:21.766263 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l2d9d\" (UniqueName: \"kubernetes.io/projected/bad7b5f7-88a8-4c20-a010-734a46f59e05-kube-api-access-l2d9d\") on node \"crc\" DevicePath \"\"" Mar 16 00:12:21 crc kubenswrapper[4816]: I0316 00:12:21.766320 4816 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bad7b5f7-88a8-4c20-a010-734a46f59e05-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 16 00:12:21 crc kubenswrapper[4816]: I0316 00:12:21.766335 4816 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bad7b5f7-88a8-4c20-a010-734a46f59e05-utilities\") on node \"crc\" DevicePath \"\"" Mar 16 00:12:22 crc kubenswrapper[4816]: I0316 00:12:22.507849 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6gbkl" Mar 16 00:12:22 crc kubenswrapper[4816]: I0316 00:12:22.544474 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6gbkl"] Mar 16 00:12:22 crc kubenswrapper[4816]: I0316 00:12:22.552314 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-6gbkl"] Mar 16 00:12:22 crc kubenswrapper[4816]: I0316 00:12:22.552714 4816 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 16 00:12:22 crc kubenswrapper[4816]: E0316 00:12:22.553050 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e570fb38-3e4c-4b9b-82d9-878ec6a5306f" containerName="oc" Mar 16 00:12:22 crc kubenswrapper[4816]: I0316 00:12:22.553083 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="e570fb38-3e4c-4b9b-82d9-878ec6a5306f" containerName="oc" Mar 16 00:12:22 crc kubenswrapper[4816]: E0316 00:12:22.553104 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fc59286-0388-4519-afc7-f2c8cf80ab40" containerName="image-pruner" Mar 16 00:12:22 crc kubenswrapper[4816]: I0316 00:12:22.553116 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fc59286-0388-4519-afc7-f2c8cf80ab40" containerName="image-pruner" Mar 16 00:12:22 crc kubenswrapper[4816]: E0316 00:12:22.553141 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff69863d-13e1-444c-ba61-6d68a509a203" containerName="extract-utilities" Mar 16 00:12:22 crc kubenswrapper[4816]: I0316 00:12:22.553156 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff69863d-13e1-444c-ba61-6d68a509a203" containerName="extract-utilities" Mar 16 00:12:22 crc kubenswrapper[4816]: E0316 00:12:22.553173 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf586c6e-f957-46fc-8140-f9a9ea22510f" containerName="extract-content" Mar 16 00:12:22 crc 
kubenswrapper[4816]: I0316 00:12:22.553185 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf586c6e-f957-46fc-8140-f9a9ea22510f" containerName="extract-content" Mar 16 00:12:22 crc kubenswrapper[4816]: E0316 00:12:22.553272 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bad7b5f7-88a8-4c20-a010-734a46f59e05" containerName="registry-server" Mar 16 00:12:22 crc kubenswrapper[4816]: I0316 00:12:22.553286 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="bad7b5f7-88a8-4c20-a010-734a46f59e05" containerName="registry-server" Mar 16 00:12:22 crc kubenswrapper[4816]: E0316 00:12:22.553302 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff69863d-13e1-444c-ba61-6d68a509a203" containerName="registry-server" Mar 16 00:12:22 crc kubenswrapper[4816]: I0316 00:12:22.553313 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff69863d-13e1-444c-ba61-6d68a509a203" containerName="registry-server" Mar 16 00:12:22 crc kubenswrapper[4816]: E0316 00:12:22.553331 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bad7b5f7-88a8-4c20-a010-734a46f59e05" containerName="extract-content" Mar 16 00:12:22 crc kubenswrapper[4816]: I0316 00:12:22.553341 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="bad7b5f7-88a8-4c20-a010-734a46f59e05" containerName="extract-content" Mar 16 00:12:22 crc kubenswrapper[4816]: E0316 00:12:22.553359 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf586c6e-f957-46fc-8140-f9a9ea22510f" containerName="extract-utilities" Mar 16 00:12:22 crc kubenswrapper[4816]: I0316 00:12:22.553372 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf586c6e-f957-46fc-8140-f9a9ea22510f" containerName="extract-utilities" Mar 16 00:12:22 crc kubenswrapper[4816]: E0316 00:12:22.553386 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bad7b5f7-88a8-4c20-a010-734a46f59e05" containerName="extract-utilities" Mar 16 00:12:22 crc 
kubenswrapper[4816]: I0316 00:12:22.553397 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="bad7b5f7-88a8-4c20-a010-734a46f59e05" containerName="extract-utilities" Mar 16 00:12:22 crc kubenswrapper[4816]: E0316 00:12:22.553411 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf586c6e-f957-46fc-8140-f9a9ea22510f" containerName="registry-server" Mar 16 00:12:22 crc kubenswrapper[4816]: I0316 00:12:22.553423 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf586c6e-f957-46fc-8140-f9a9ea22510f" containerName="registry-server" Mar 16 00:12:22 crc kubenswrapper[4816]: E0316 00:12:22.553439 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f10e41c-e6db-4083-bb86-ed0d39cc1a5a" containerName="pruner" Mar 16 00:12:22 crc kubenswrapper[4816]: I0316 00:12:22.553450 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f10e41c-e6db-4083-bb86-ed0d39cc1a5a" containerName="pruner" Mar 16 00:12:22 crc kubenswrapper[4816]: E0316 00:12:22.553464 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff69863d-13e1-444c-ba61-6d68a509a203" containerName="extract-content" Mar 16 00:12:22 crc kubenswrapper[4816]: I0316 00:12:22.553475 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff69863d-13e1-444c-ba61-6d68a509a203" containerName="extract-content" Mar 16 00:12:22 crc kubenswrapper[4816]: I0316 00:12:22.553743 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="bad7b5f7-88a8-4c20-a010-734a46f59e05" containerName="registry-server" Mar 16 00:12:22 crc kubenswrapper[4816]: I0316 00:12:22.553758 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f10e41c-e6db-4083-bb86-ed0d39cc1a5a" containerName="pruner" Mar 16 00:12:22 crc kubenswrapper[4816]: I0316 00:12:22.553769 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="e570fb38-3e4c-4b9b-82d9-878ec6a5306f" containerName="oc" Mar 16 00:12:22 crc kubenswrapper[4816]: I0316 00:12:22.553776 
4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff69863d-13e1-444c-ba61-6d68a509a203" containerName="registry-server" Mar 16 00:12:22 crc kubenswrapper[4816]: I0316 00:12:22.553791 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="9fc59286-0388-4519-afc7-f2c8cf80ab40" containerName="image-pruner" Mar 16 00:12:22 crc kubenswrapper[4816]: I0316 00:12:22.553798 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf586c6e-f957-46fc-8140-f9a9ea22510f" containerName="registry-server" Mar 16 00:12:22 crc kubenswrapper[4816]: I0316 00:12:22.554201 4816 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 16 00:12:22 crc kubenswrapper[4816]: I0316 00:12:22.554309 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 16 00:12:22 crc kubenswrapper[4816]: I0316 00:12:22.554507 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://46db23088e8ba791d736f70a65bd066af80a5ec27c31332d9ac9c52530b21bfd" gracePeriod=15 Mar 16 00:12:22 crc kubenswrapper[4816]: I0316 00:12:22.554586 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://c81d0c40be9c3912864280cd98481f837ef29f69d85599b39decf5e9d7a97eff" gracePeriod=15 Mar 16 00:12:22 crc kubenswrapper[4816]: I0316 00:12:22.554670 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" 
containerID="cri-o://da7b257af12b9ee605dc32a26d9565f866a9cafebb4936a0bde7f42ae9909f80" gracePeriod=15 Mar 16 00:12:22 crc kubenswrapper[4816]: I0316 00:12:22.554734 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://4e7f3b9b1fc1648908bbf44f9ad32e1eb510b68336a8b60c8417df2f622a36e5" gracePeriod=15 Mar 16 00:12:22 crc kubenswrapper[4816]: I0316 00:12:22.554936 4816 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 16 00:12:22 crc kubenswrapper[4816]: E0316 00:12:22.555188 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 16 00:12:22 crc kubenswrapper[4816]: I0316 00:12:22.555203 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 16 00:12:22 crc kubenswrapper[4816]: E0316 00:12:22.555256 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 16 00:12:22 crc kubenswrapper[4816]: I0316 00:12:22.555267 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 16 00:12:22 crc kubenswrapper[4816]: E0316 00:12:22.555277 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 16 00:12:22 crc kubenswrapper[4816]: I0316 00:12:22.555284 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 16 00:12:22 crc kubenswrapper[4816]: E0316 00:12:22.555296 4816 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 16 00:12:22 crc kubenswrapper[4816]: I0316 00:12:22.555303 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 16 00:12:22 crc kubenswrapper[4816]: E0316 00:12:22.555312 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 16 00:12:22 crc kubenswrapper[4816]: I0316 00:12:22.555319 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 16 00:12:22 crc kubenswrapper[4816]: E0316 00:12:22.555329 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 16 00:12:22 crc kubenswrapper[4816]: I0316 00:12:22.555335 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 16 00:12:22 crc kubenswrapper[4816]: E0316 00:12:22.555342 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 16 00:12:22 crc kubenswrapper[4816]: I0316 00:12:22.555350 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 16 00:12:22 crc kubenswrapper[4816]: E0316 00:12:22.555369 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 16 00:12:22 crc kubenswrapper[4816]: I0316 00:12:22.555376 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 16 00:12:22 crc kubenswrapper[4816]: I0316 
00:12:22.555481 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 16 00:12:22 crc kubenswrapper[4816]: I0316 00:12:22.555493 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 16 00:12:22 crc kubenswrapper[4816]: I0316 00:12:22.555501 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 16 00:12:22 crc kubenswrapper[4816]: I0316 00:12:22.555516 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 16 00:12:22 crc kubenswrapper[4816]: I0316 00:12:22.555527 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 16 00:12:22 crc kubenswrapper[4816]: I0316 00:12:22.555536 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 16 00:12:22 crc kubenswrapper[4816]: I0316 00:12:22.555562 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 16 00:12:22 crc kubenswrapper[4816]: E0316 00:12:22.555671 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 16 00:12:22 crc kubenswrapper[4816]: I0316 00:12:22.555681 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 16 00:12:22 crc kubenswrapper[4816]: E0316 00:12:22.555692 4816 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 16 00:12:22 crc kubenswrapper[4816]: I0316 00:12:22.555699 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 16 00:12:22 crc kubenswrapper[4816]: I0316 00:12:22.555796 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 16 00:12:22 crc kubenswrapper[4816]: I0316 00:12:22.555810 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 16 00:12:22 crc kubenswrapper[4816]: I0316 00:12:22.554761 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://c2523ea43f2490afb81354be8a2840f54a3be1a8d3154196e0c8ac365d09723a" gracePeriod=15 Mar 16 00:12:22 crc kubenswrapper[4816]: I0316 00:12:22.598761 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 16 00:12:22 crc kubenswrapper[4816]: I0316 00:12:22.678293 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 16 00:12:22 crc kubenswrapper[4816]: I0316 00:12:22.678346 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: 
\"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 16 00:12:22 crc kubenswrapper[4816]: I0316 00:12:22.678415 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 16 00:12:22 crc kubenswrapper[4816]: I0316 00:12:22.678438 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 16 00:12:22 crc kubenswrapper[4816]: I0316 00:12:22.678466 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 16 00:12:22 crc kubenswrapper[4816]: I0316 00:12:22.678557 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 16 00:12:22 crc kubenswrapper[4816]: I0316 00:12:22.678591 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 16 00:12:22 crc kubenswrapper[4816]: I0316 00:12:22.678620 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 16 00:12:22 crc kubenswrapper[4816]: I0316 00:12:22.779533 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 16 00:12:22 crc kubenswrapper[4816]: I0316 00:12:22.779946 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 16 00:12:22 crc kubenswrapper[4816]: I0316 00:12:22.780089 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 16 00:12:22 crc kubenswrapper[4816]: I0316 00:12:22.780151 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: 
\"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 16 00:12:22 crc kubenswrapper[4816]: I0316 00:12:22.780186 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 16 00:12:22 crc kubenswrapper[4816]: I0316 00:12:22.780218 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 16 00:12:22 crc kubenswrapper[4816]: I0316 00:12:22.780258 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 16 00:12:22 crc kubenswrapper[4816]: I0316 00:12:22.780294 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 16 00:12:22 crc kubenswrapper[4816]: I0316 00:12:22.780387 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 16 00:12:22 crc kubenswrapper[4816]: I0316 00:12:22.780435 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 16 00:12:22 crc kubenswrapper[4816]: I0316 00:12:22.780464 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 16 00:12:22 crc kubenswrapper[4816]: I0316 00:12:22.780491 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 16 00:12:22 crc kubenswrapper[4816]: I0316 00:12:22.780517 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 16 00:12:22 crc kubenswrapper[4816]: I0316 00:12:22.780567 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod 
\"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 16 00:12:22 crc kubenswrapper[4816]: I0316 00:12:22.780602 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 16 00:12:22 crc kubenswrapper[4816]: I0316 00:12:22.780631 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 16 00:12:22 crc kubenswrapper[4816]: I0316 00:12:22.888297 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 16 00:12:22 crc kubenswrapper[4816]: W0316 00:12:22.908685 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-64dc539850c7ea7966762da40da0e3a6db88dd12101785edd9669d7fb3f9a68a WatchSource:0}: Error finding container 64dc539850c7ea7966762da40da0e3a6db88dd12101785edd9669d7fb3f9a68a: Status 404 returned error can't find the container with id 64dc539850c7ea7966762da40da0e3a6db88dd12101785edd9669d7fb3f9a68a Mar 16 00:12:22 crc kubenswrapper[4816]: E0316 00:12:22.912538 4816 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.158:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189d29f0e7c5f931 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:12:22.911244593 +0000 UTC m=+336.007544546,LastTimestamp:2026-03-16 00:12:22.911244593 +0000 UTC m=+336.007544546,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:12:23 crc kubenswrapper[4816]: I0316 00:12:23.513990 4816 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 16 00:12:23 crc kubenswrapper[4816]: I0316 00:12:23.515370 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 16 00:12:23 crc kubenswrapper[4816]: I0316 00:12:23.516057 4816 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="4e7f3b9b1fc1648908bbf44f9ad32e1eb510b68336a8b60c8417df2f622a36e5" exitCode=0 Mar 16 00:12:23 crc kubenswrapper[4816]: I0316 00:12:23.516082 4816 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="c81d0c40be9c3912864280cd98481f837ef29f69d85599b39decf5e9d7a97eff" exitCode=0 Mar 16 00:12:23 crc kubenswrapper[4816]: I0316 00:12:23.516091 4816 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="da7b257af12b9ee605dc32a26d9565f866a9cafebb4936a0bde7f42ae9909f80" exitCode=0 Mar 16 00:12:23 crc kubenswrapper[4816]: I0316 00:12:23.516098 4816 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="c2523ea43f2490afb81354be8a2840f54a3be1a8d3154196e0c8ac365d09723a" exitCode=2 Mar 16 00:12:23 crc kubenswrapper[4816]: I0316 00:12:23.516156 4816 scope.go:117] "RemoveContainer" containerID="29c4319e41bd025417b11cb87b682a66698ba7575801f247b1192dc8067ab66f" Mar 16 00:12:23 crc kubenswrapper[4816]: I0316 00:12:23.517862 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"eed2d54a8999c489e0b768ad5f7c5739ae6d65acf1bf2efc0c5987b2cb49fba4"} Mar 16 00:12:23 crc kubenswrapper[4816]: I0316 00:12:23.517975 4816 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"64dc539850c7ea7966762da40da0e3a6db88dd12101785edd9669d7fb3f9a68a"} Mar 16 00:12:23 crc kubenswrapper[4816]: I0316 00:12:23.519181 4816 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 16 00:12:23 crc kubenswrapper[4816]: I0316 00:12:23.519640 4816 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 16 00:12:23 crc kubenswrapper[4816]: I0316 00:12:23.520893 4816 generic.go:334] "Generic (PLEG): container finished" podID="a9416716-e666-46d6-9d77-fe5c9702c035" containerID="dd1bd79082698289c067776233601bb17f6ba8cbb98dcb745b3329e5f4f6fb1f" exitCode=0 Mar 16 00:12:23 crc kubenswrapper[4816]: I0316 00:12:23.521033 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"a9416716-e666-46d6-9d77-fe5c9702c035","Type":"ContainerDied","Data":"dd1bd79082698289c067776233601bb17f6ba8cbb98dcb745b3329e5f4f6fb1f"} Mar 16 00:12:23 crc kubenswrapper[4816]: I0316 00:12:23.521595 4816 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 16 
00:12:23 crc kubenswrapper[4816]: I0316 00:12:23.521915 4816 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 16 00:12:23 crc kubenswrapper[4816]: I0316 00:12:23.522136 4816 status_manager.go:851] "Failed to get status for pod" podUID="a9416716-e666-46d6-9d77-fe5c9702c035" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 16 00:12:23 crc kubenswrapper[4816]: I0316 00:12:23.677730 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bad7b5f7-88a8-4c20-a010-734a46f59e05" path="/var/lib/kubelet/pods/bad7b5f7-88a8-4c20-a010-734a46f59e05/volumes" Mar 16 00:12:23 crc kubenswrapper[4816]: E0316 00:12:23.752713 4816 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 16 00:12:23 crc kubenswrapper[4816]: E0316 00:12:23.753205 4816 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 16 00:12:23 crc kubenswrapper[4816]: E0316 00:12:23.753448 4816 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 16 00:12:23 crc kubenswrapper[4816]: E0316 00:12:23.753690 4816 
controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 16 00:12:23 crc kubenswrapper[4816]: E0316 00:12:23.754036 4816 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 16 00:12:23 crc kubenswrapper[4816]: I0316 00:12:23.754136 4816 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Mar 16 00:12:23 crc kubenswrapper[4816]: E0316 00:12:23.754506 4816 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.158:6443: connect: connection refused" interval="200ms" Mar 16 00:12:23 crc kubenswrapper[4816]: E0316 00:12:23.954929 4816 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.158:6443: connect: connection refused" interval="400ms" Mar 16 00:12:24 crc kubenswrapper[4816]: E0316 00:12:24.356089 4816 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.158:6443: connect: connection refused" interval="800ms" Mar 16 00:12:24 crc kubenswrapper[4816]: I0316 00:12:24.529484 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/0.log" Mar 16 00:12:24 crc 
kubenswrapper[4816]: I0316 00:12:24.529573 4816 generic.go:334] "Generic (PLEG): container finished" podID="9d751cbb-f2e2-430d-9754-c882a5e924a5" containerID="3ea1184f822884ab69fdfb7a5fe5136882ffc21bf737db72c6911e5ee012e8d7" exitCode=255 Mar 16 00:12:24 crc kubenswrapper[4816]: I0316 00:12:24.529676 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerDied","Data":"3ea1184f822884ab69fdfb7a5fe5136882ffc21bf737db72c6911e5ee012e8d7"} Mar 16 00:12:24 crc kubenswrapper[4816]: I0316 00:12:24.530258 4816 scope.go:117] "RemoveContainer" containerID="3ea1184f822884ab69fdfb7a5fe5136882ffc21bf737db72c6911e5ee012e8d7" Mar 16 00:12:24 crc kubenswrapper[4816]: I0316 00:12:24.530301 4816 status_manager.go:851] "Failed to get status for pod" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-source-55646444c4-trplf\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 16 00:12:24 crc kubenswrapper[4816]: I0316 00:12:24.530625 4816 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 16 00:12:24 crc kubenswrapper[4816]: I0316 00:12:24.531371 4816 status_manager.go:851] "Failed to get status for pod" podUID="a9416716-e666-46d6-9d77-fe5c9702c035" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" 
Mar 16 00:12:24 crc kubenswrapper[4816]: I0316 00:12:24.533604 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 16 00:12:25 crc kubenswrapper[4816]: I0316 00:12:25.078416 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 16 00:12:25 crc kubenswrapper[4816]: I0316 00:12:25.079622 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 16 00:12:25 crc kubenswrapper[4816]: I0316 00:12:25.080271 4816 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 16 00:12:25 crc kubenswrapper[4816]: I0316 00:12:25.080605 4816 status_manager.go:851] "Failed to get status for pod" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-source-55646444c4-trplf\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 16 00:12:25 crc kubenswrapper[4816]: I0316 00:12:25.080828 4816 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 16 00:12:25 crc kubenswrapper[4816]: I0316 00:12:25.081035 4816 
status_manager.go:851] "Failed to get status for pod" podUID="a9416716-e666-46d6-9d77-fe5c9702c035" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 16 00:12:25 crc kubenswrapper[4816]: I0316 00:12:25.082590 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 16 00:12:25 crc kubenswrapper[4816]: I0316 00:12:25.083228 4816 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 16 00:12:25 crc kubenswrapper[4816]: I0316 00:12:25.083538 4816 status_manager.go:851] "Failed to get status for pod" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-source-55646444c4-trplf\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 16 00:12:25 crc kubenswrapper[4816]: I0316 00:12:25.083892 4816 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 16 00:12:25 crc kubenswrapper[4816]: I0316 00:12:25.084202 4816 status_manager.go:851] "Failed to get status for pod" podUID="a9416716-e666-46d6-9d77-fe5c9702c035" pod="openshift-kube-apiserver/installer-9-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 16 00:12:25 crc kubenswrapper[4816]: E0316 00:12:25.157359 4816 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.158:6443: connect: connection refused" interval="1.6s" Mar 16 00:12:25 crc kubenswrapper[4816]: I0316 00:12:25.168940 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 16 00:12:25 crc kubenswrapper[4816]: I0316 00:12:25.169007 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 16 00:12:25 crc kubenswrapper[4816]: I0316 00:12:25.169038 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a9416716-e666-46d6-9d77-fe5c9702c035-var-lock\") pod \"a9416716-e666-46d6-9d77-fe5c9702c035\" (UID: \"a9416716-e666-46d6-9d77-fe5c9702c035\") " Mar 16 00:12:25 crc kubenswrapper[4816]: I0316 00:12:25.169095 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a9416716-e666-46d6-9d77-fe5c9702c035-kubelet-dir\") pod \"a9416716-e666-46d6-9d77-fe5c9702c035\" (UID: \"a9416716-e666-46d6-9d77-fe5c9702c035\") " Mar 16 00:12:25 crc kubenswrapper[4816]: I0316 00:12:25.169158 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a9416716-e666-46d6-9d77-fe5c9702c035-kube-api-access\") pod \"a9416716-e666-46d6-9d77-fe5c9702c035\" (UID: \"a9416716-e666-46d6-9d77-fe5c9702c035\") " Mar 16 00:12:25 crc kubenswrapper[4816]: I0316 00:12:25.169089 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:12:25 crc kubenswrapper[4816]: I0316 00:12:25.169084 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:12:25 crc kubenswrapper[4816]: I0316 00:12:25.169117 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a9416716-e666-46d6-9d77-fe5c9702c035-var-lock" (OuterVolumeSpecName: "var-lock") pod "a9416716-e666-46d6-9d77-fe5c9702c035" (UID: "a9416716-e666-46d6-9d77-fe5c9702c035"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:12:25 crc kubenswrapper[4816]: I0316 00:12:25.169125 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a9416716-e666-46d6-9d77-fe5c9702c035-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "a9416716-e666-46d6-9d77-fe5c9702c035" (UID: "a9416716-e666-46d6-9d77-fe5c9702c035"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:12:25 crc kubenswrapper[4816]: I0316 00:12:25.169264 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 16 00:12:25 crc kubenswrapper[4816]: I0316 00:12:25.169349 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:12:25 crc kubenswrapper[4816]: I0316 00:12:25.169579 4816 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 16 00:12:25 crc kubenswrapper[4816]: I0316 00:12:25.169601 4816 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 16 00:12:25 crc kubenswrapper[4816]: I0316 00:12:25.169629 4816 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a9416716-e666-46d6-9d77-fe5c9702c035-var-lock\") on node \"crc\" DevicePath \"\"" Mar 16 00:12:25 crc kubenswrapper[4816]: I0316 00:12:25.169640 4816 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a9416716-e666-46d6-9d77-fe5c9702c035-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 16 00:12:25 crc kubenswrapper[4816]: I0316 00:12:25.169648 4816 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Mar 16 00:12:25 crc kubenswrapper[4816]: I0316 00:12:25.178655 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9416716-e666-46d6-9d77-fe5c9702c035-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "a9416716-e666-46d6-9d77-fe5c9702c035" (UID: "a9416716-e666-46d6-9d77-fe5c9702c035"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:12:25 crc kubenswrapper[4816]: I0316 00:12:25.271108 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a9416716-e666-46d6-9d77-fe5c9702c035-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 16 00:12:25 crc kubenswrapper[4816]: I0316 00:12:25.542934 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"a9416716-e666-46d6-9d77-fe5c9702c035","Type":"ContainerDied","Data":"dd148857a3f8f3853eb8381f642acb80c9aad6dc4ab5491e0ecfe89f172f60d6"} Mar 16 00:12:25 crc kubenswrapper[4816]: I0316 00:12:25.542974 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 16 00:12:25 crc kubenswrapper[4816]: I0316 00:12:25.543003 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dd148857a3f8f3853eb8381f642acb80c9aad6dc4ab5491e0ecfe89f172f60d6" Mar 16 00:12:25 crc kubenswrapper[4816]: I0316 00:12:25.546391 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 16 00:12:25 crc kubenswrapper[4816]: I0316 00:12:25.547288 4816 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="46db23088e8ba791d736f70a65bd066af80a5ec27c31332d9ac9c52530b21bfd" exitCode=0 Mar 16 00:12:25 crc kubenswrapper[4816]: I0316 00:12:25.547410 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 16 00:12:25 crc kubenswrapper[4816]: I0316 00:12:25.547410 4816 scope.go:117] "RemoveContainer" containerID="4e7f3b9b1fc1648908bbf44f9ad32e1eb510b68336a8b60c8417df2f622a36e5" Mar 16 00:12:25 crc kubenswrapper[4816]: I0316 00:12:25.549946 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/1.log" Mar 16 00:12:25 crc kubenswrapper[4816]: I0316 00:12:25.551335 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/0.log" Mar 16 00:12:25 crc kubenswrapper[4816]: I0316 00:12:25.551391 4816 generic.go:334] "Generic (PLEG): container finished" podID="9d751cbb-f2e2-430d-9754-c882a5e924a5" containerID="486dacbeaae2de35a560c3abe670d11c1b0aff52bf6fcfee4e790d8493ac9b11" exitCode=255 Mar 16 00:12:25 crc kubenswrapper[4816]: I0316 00:12:25.551436 4816 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerDied","Data":"486dacbeaae2de35a560c3abe670d11c1b0aff52bf6fcfee4e790d8493ac9b11"} Mar 16 00:12:25 crc kubenswrapper[4816]: I0316 00:12:25.552491 4816 scope.go:117] "RemoveContainer" containerID="486dacbeaae2de35a560c3abe670d11c1b0aff52bf6fcfee4e790d8493ac9b11" Mar 16 00:12:25 crc kubenswrapper[4816]: I0316 00:12:25.552647 4816 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 16 00:12:25 crc kubenswrapper[4816]: E0316 00:12:25.553036 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=check-endpoints pod=network-check-source-55646444c4-trplf_openshift-network-diagnostics(9d751cbb-f2e2-430d-9754-c882a5e924a5)\"" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:12:25 crc kubenswrapper[4816]: I0316 00:12:25.553103 4816 status_manager.go:851] "Failed to get status for pod" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-source-55646444c4-trplf\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 16 00:12:25 crc kubenswrapper[4816]: I0316 00:12:25.553391 4816 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" 
err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 16 00:12:25 crc kubenswrapper[4816]: I0316 00:12:25.553652 4816 status_manager.go:851] "Failed to get status for pod" podUID="a9416716-e666-46d6-9d77-fe5c9702c035" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 16 00:12:25 crc kubenswrapper[4816]: I0316 00:12:25.596801 4816 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 16 00:12:25 crc kubenswrapper[4816]: I0316 00:12:25.597224 4816 status_manager.go:851] "Failed to get status for pod" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-source-55646444c4-trplf\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 16 00:12:25 crc kubenswrapper[4816]: I0316 00:12:25.597807 4816 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 16 00:12:25 crc kubenswrapper[4816]: I0316 00:12:25.599014 4816 status_manager.go:851] "Failed to get status for pod" podUID="a9416716-e666-46d6-9d77-fe5c9702c035" 
pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 16 00:12:25 crc kubenswrapper[4816]: I0316 00:12:25.599753 4816 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 16 00:12:25 crc kubenswrapper[4816]: I0316 00:12:25.600294 4816 status_manager.go:851] "Failed to get status for pod" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-source-55646444c4-trplf\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 16 00:12:25 crc kubenswrapper[4816]: I0316 00:12:25.600702 4816 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 16 00:12:25 crc kubenswrapper[4816]: I0316 00:12:25.601311 4816 status_manager.go:851] "Failed to get status for pod" podUID="a9416716-e666-46d6-9d77-fe5c9702c035" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 16 00:12:25 crc kubenswrapper[4816]: I0316 00:12:25.606885 4816 scope.go:117] "RemoveContainer" 
containerID="c81d0c40be9c3912864280cd98481f837ef29f69d85599b39decf5e9d7a97eff" Mar 16 00:12:25 crc kubenswrapper[4816]: I0316 00:12:25.621779 4816 scope.go:117] "RemoveContainer" containerID="da7b257af12b9ee605dc32a26d9565f866a9cafebb4936a0bde7f42ae9909f80" Mar 16 00:12:25 crc kubenswrapper[4816]: I0316 00:12:25.650635 4816 scope.go:117] "RemoveContainer" containerID="c2523ea43f2490afb81354be8a2840f54a3be1a8d3154196e0c8ac365d09723a" Mar 16 00:12:25 crc kubenswrapper[4816]: I0316 00:12:25.665812 4816 scope.go:117] "RemoveContainer" containerID="46db23088e8ba791d736f70a65bd066af80a5ec27c31332d9ac9c52530b21bfd" Mar 16 00:12:25 crc kubenswrapper[4816]: I0316 00:12:25.675352 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Mar 16 00:12:25 crc kubenswrapper[4816]: I0316 00:12:25.688533 4816 scope.go:117] "RemoveContainer" containerID="0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1" Mar 16 00:12:25 crc kubenswrapper[4816]: I0316 00:12:25.711458 4816 scope.go:117] "RemoveContainer" containerID="4e7f3b9b1fc1648908bbf44f9ad32e1eb510b68336a8b60c8417df2f622a36e5" Mar 16 00:12:25 crc kubenswrapper[4816]: E0316 00:12:25.712235 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e7f3b9b1fc1648908bbf44f9ad32e1eb510b68336a8b60c8417df2f622a36e5\": container with ID starting with 4e7f3b9b1fc1648908bbf44f9ad32e1eb510b68336a8b60c8417df2f622a36e5 not found: ID does not exist" containerID="4e7f3b9b1fc1648908bbf44f9ad32e1eb510b68336a8b60c8417df2f622a36e5" Mar 16 00:12:25 crc kubenswrapper[4816]: I0316 00:12:25.712285 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e7f3b9b1fc1648908bbf44f9ad32e1eb510b68336a8b60c8417df2f622a36e5"} err="failed to get container status 
\"4e7f3b9b1fc1648908bbf44f9ad32e1eb510b68336a8b60c8417df2f622a36e5\": rpc error: code = NotFound desc = could not find container \"4e7f3b9b1fc1648908bbf44f9ad32e1eb510b68336a8b60c8417df2f622a36e5\": container with ID starting with 4e7f3b9b1fc1648908bbf44f9ad32e1eb510b68336a8b60c8417df2f622a36e5 not found: ID does not exist" Mar 16 00:12:25 crc kubenswrapper[4816]: I0316 00:12:25.712315 4816 scope.go:117] "RemoveContainer" containerID="c81d0c40be9c3912864280cd98481f837ef29f69d85599b39decf5e9d7a97eff" Mar 16 00:12:25 crc kubenswrapper[4816]: E0316 00:12:25.712967 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c81d0c40be9c3912864280cd98481f837ef29f69d85599b39decf5e9d7a97eff\": container with ID starting with c81d0c40be9c3912864280cd98481f837ef29f69d85599b39decf5e9d7a97eff not found: ID does not exist" containerID="c81d0c40be9c3912864280cd98481f837ef29f69d85599b39decf5e9d7a97eff" Mar 16 00:12:25 crc kubenswrapper[4816]: I0316 00:12:25.713006 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c81d0c40be9c3912864280cd98481f837ef29f69d85599b39decf5e9d7a97eff"} err="failed to get container status \"c81d0c40be9c3912864280cd98481f837ef29f69d85599b39decf5e9d7a97eff\": rpc error: code = NotFound desc = could not find container \"c81d0c40be9c3912864280cd98481f837ef29f69d85599b39decf5e9d7a97eff\": container with ID starting with c81d0c40be9c3912864280cd98481f837ef29f69d85599b39decf5e9d7a97eff not found: ID does not exist" Mar 16 00:12:25 crc kubenswrapper[4816]: I0316 00:12:25.713036 4816 scope.go:117] "RemoveContainer" containerID="da7b257af12b9ee605dc32a26d9565f866a9cafebb4936a0bde7f42ae9909f80" Mar 16 00:12:25 crc kubenswrapper[4816]: E0316 00:12:25.713312 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"da7b257af12b9ee605dc32a26d9565f866a9cafebb4936a0bde7f42ae9909f80\": container with ID starting with da7b257af12b9ee605dc32a26d9565f866a9cafebb4936a0bde7f42ae9909f80 not found: ID does not exist" containerID="da7b257af12b9ee605dc32a26d9565f866a9cafebb4936a0bde7f42ae9909f80" Mar 16 00:12:25 crc kubenswrapper[4816]: I0316 00:12:25.713331 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da7b257af12b9ee605dc32a26d9565f866a9cafebb4936a0bde7f42ae9909f80"} err="failed to get container status \"da7b257af12b9ee605dc32a26d9565f866a9cafebb4936a0bde7f42ae9909f80\": rpc error: code = NotFound desc = could not find container \"da7b257af12b9ee605dc32a26d9565f866a9cafebb4936a0bde7f42ae9909f80\": container with ID starting with da7b257af12b9ee605dc32a26d9565f866a9cafebb4936a0bde7f42ae9909f80 not found: ID does not exist" Mar 16 00:12:25 crc kubenswrapper[4816]: I0316 00:12:25.713343 4816 scope.go:117] "RemoveContainer" containerID="c2523ea43f2490afb81354be8a2840f54a3be1a8d3154196e0c8ac365d09723a" Mar 16 00:12:25 crc kubenswrapper[4816]: E0316 00:12:25.714100 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c2523ea43f2490afb81354be8a2840f54a3be1a8d3154196e0c8ac365d09723a\": container with ID starting with c2523ea43f2490afb81354be8a2840f54a3be1a8d3154196e0c8ac365d09723a not found: ID does not exist" containerID="c2523ea43f2490afb81354be8a2840f54a3be1a8d3154196e0c8ac365d09723a" Mar 16 00:12:25 crc kubenswrapper[4816]: I0316 00:12:25.714121 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c2523ea43f2490afb81354be8a2840f54a3be1a8d3154196e0c8ac365d09723a"} err="failed to get container status \"c2523ea43f2490afb81354be8a2840f54a3be1a8d3154196e0c8ac365d09723a\": rpc error: code = NotFound desc = could not find container \"c2523ea43f2490afb81354be8a2840f54a3be1a8d3154196e0c8ac365d09723a\": container with ID 
starting with c2523ea43f2490afb81354be8a2840f54a3be1a8d3154196e0c8ac365d09723a not found: ID does not exist" Mar 16 00:12:25 crc kubenswrapper[4816]: I0316 00:12:25.714134 4816 scope.go:117] "RemoveContainer" containerID="46db23088e8ba791d736f70a65bd066af80a5ec27c31332d9ac9c52530b21bfd" Mar 16 00:12:25 crc kubenswrapper[4816]: E0316 00:12:25.714336 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"46db23088e8ba791d736f70a65bd066af80a5ec27c31332d9ac9c52530b21bfd\": container with ID starting with 46db23088e8ba791d736f70a65bd066af80a5ec27c31332d9ac9c52530b21bfd not found: ID does not exist" containerID="46db23088e8ba791d736f70a65bd066af80a5ec27c31332d9ac9c52530b21bfd" Mar 16 00:12:25 crc kubenswrapper[4816]: I0316 00:12:25.714354 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46db23088e8ba791d736f70a65bd066af80a5ec27c31332d9ac9c52530b21bfd"} err="failed to get container status \"46db23088e8ba791d736f70a65bd066af80a5ec27c31332d9ac9c52530b21bfd\": rpc error: code = NotFound desc = could not find container \"46db23088e8ba791d736f70a65bd066af80a5ec27c31332d9ac9c52530b21bfd\": container with ID starting with 46db23088e8ba791d736f70a65bd066af80a5ec27c31332d9ac9c52530b21bfd not found: ID does not exist" Mar 16 00:12:25 crc kubenswrapper[4816]: I0316 00:12:25.714372 4816 scope.go:117] "RemoveContainer" containerID="0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1" Mar 16 00:12:25 crc kubenswrapper[4816]: E0316 00:12:25.714587 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\": container with ID starting with 0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1 not found: ID does not exist" containerID="0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1" Mar 16 
00:12:25 crc kubenswrapper[4816]: I0316 00:12:25.714608 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1"} err="failed to get container status \"0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\": rpc error: code = NotFound desc = could not find container \"0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1\": container with ID starting with 0e39d816de31fa1c42a254ec61527496e7e85121f67007a774524167b1063cf1 not found: ID does not exist" Mar 16 00:12:25 crc kubenswrapper[4816]: I0316 00:12:25.714621 4816 scope.go:117] "RemoveContainer" containerID="3ea1184f822884ab69fdfb7a5fe5136882ffc21bf737db72c6911e5ee012e8d7" Mar 16 00:12:26 crc kubenswrapper[4816]: I0316 00:12:26.511755 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-hvpqn" Mar 16 00:12:26 crc kubenswrapper[4816]: I0316 00:12:26.512502 4816 status_manager.go:851] "Failed to get status for pod" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-source-55646444c4-trplf\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 16 00:12:26 crc kubenswrapper[4816]: I0316 00:12:26.512895 4816 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 16 00:12:26 crc kubenswrapper[4816]: I0316 00:12:26.513076 4816 status_manager.go:851] "Failed to get status for pod" podUID="cc1ea93d-1cf8-4145-ad35-83f2d1357f9d" 
pod="openshift-marketplace/redhat-operators-hvpqn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-hvpqn\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 16 00:12:26 crc kubenswrapper[4816]: I0316 00:12:26.513389 4816 status_manager.go:851] "Failed to get status for pod" podUID="a9416716-e666-46d6-9d77-fe5c9702c035" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 16 00:12:26 crc kubenswrapper[4816]: I0316 00:12:26.550318 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-hvpqn" Mar 16 00:12:26 crc kubenswrapper[4816]: I0316 00:12:26.550909 4816 status_manager.go:851] "Failed to get status for pod" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-source-55646444c4-trplf\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 16 00:12:26 crc kubenswrapper[4816]: I0316 00:12:26.551314 4816 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 16 00:12:26 crc kubenswrapper[4816]: I0316 00:12:26.551670 4816 status_manager.go:851] "Failed to get status for pod" podUID="cc1ea93d-1cf8-4145-ad35-83f2d1357f9d" pod="openshift-marketplace/redhat-operators-hvpqn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-hvpqn\": dial tcp 
38.102.83.158:6443: connect: connection refused" Mar 16 00:12:26 crc kubenswrapper[4816]: I0316 00:12:26.552074 4816 status_manager.go:851] "Failed to get status for pod" podUID="a9416716-e666-46d6-9d77-fe5c9702c035" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 16 00:12:26 crc kubenswrapper[4816]: I0316 00:12:26.556540 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/1.log" Mar 16 00:12:26 crc kubenswrapper[4816]: E0316 00:12:26.569574 4816 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.158:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189d29f0e7c5f931 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:12:22.911244593 +0000 UTC m=+336.007544546,LastTimestamp:2026-03-16 00:12:22.911244593 +0000 UTC m=+336.007544546,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:12:26 crc kubenswrapper[4816]: E0316 00:12:26.758864 4816 controller.go:145] "Failed to 
ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.158:6443: connect: connection refused" interval="3.2s" Mar 16 00:12:27 crc kubenswrapper[4816]: I0316 00:12:27.672112 4816 status_manager.go:851] "Failed to get status for pod" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-source-55646444c4-trplf\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 16 00:12:27 crc kubenswrapper[4816]: I0316 00:12:27.672825 4816 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 16 00:12:27 crc kubenswrapper[4816]: I0316 00:12:27.673069 4816 status_manager.go:851] "Failed to get status for pod" podUID="cc1ea93d-1cf8-4145-ad35-83f2d1357f9d" pod="openshift-marketplace/redhat-operators-hvpqn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-hvpqn\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 16 00:12:27 crc kubenswrapper[4816]: I0316 00:12:27.673433 4816 status_manager.go:851] "Failed to get status for pod" podUID="a9416716-e666-46d6-9d77-fe5c9702c035" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 16 00:12:27 crc kubenswrapper[4816]: E0316 00:12:27.715525 4816 desired_state_of_world_populator.go:312] "Error processing volume" 
err="error processing PVC openshift-image-registry/crc-image-registry-storage: failed to fetch PVC from API server: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/persistentvolumeclaims/crc-image-registry-storage\": dial tcp 38.102.83.158:6443: connect: connection refused" pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" volumeName="registry-storage" Mar 16 00:12:29 crc kubenswrapper[4816]: E0316 00:12:29.960233 4816 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.158:6443: connect: connection refused" interval="6.4s" Mar 16 00:12:35 crc kubenswrapper[4816]: I0316 00:12:35.612083 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 16 00:12:35 crc kubenswrapper[4816]: I0316 00:12:35.613028 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 16 00:12:35 crc kubenswrapper[4816]: I0316 00:12:35.613075 4816 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="5f3dfc8f46079b51e52802920a734bf796a00db2cf42501b9f7e202e0e9bc2ac" exitCode=1 Mar 16 00:12:35 crc kubenswrapper[4816]: I0316 00:12:35.613120 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"5f3dfc8f46079b51e52802920a734bf796a00db2cf42501b9f7e202e0e9bc2ac"} Mar 16 00:12:35 crc kubenswrapper[4816]: I0316 00:12:35.613720 4816 scope.go:117] "RemoveContainer" 
containerID="5f3dfc8f46079b51e52802920a734bf796a00db2cf42501b9f7e202e0e9bc2ac" Mar 16 00:12:35 crc kubenswrapper[4816]: I0316 00:12:35.614885 4816 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 16 00:12:35 crc kubenswrapper[4816]: I0316 00:12:35.615383 4816 status_manager.go:851] "Failed to get status for pod" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-source-55646444c4-trplf\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 16 00:12:35 crc kubenswrapper[4816]: I0316 00:12:35.616529 4816 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 16 00:12:35 crc kubenswrapper[4816]: I0316 00:12:35.617334 4816 status_manager.go:851] "Failed to get status for pod" podUID="cc1ea93d-1cf8-4145-ad35-83f2d1357f9d" pod="openshift-marketplace/redhat-operators-hvpqn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-hvpqn\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 16 00:12:35 crc kubenswrapper[4816]: I0316 00:12:35.617771 4816 status_manager.go:851] "Failed to get status for pod" podUID="a9416716-e666-46d6-9d77-fe5c9702c035" pod="openshift-kube-apiserver/installer-9-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 16 00:12:36 crc kubenswrapper[4816]: E0316 00:12:36.361251 4816 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.158:6443: connect: connection refused" interval="7s" Mar 16 00:12:36 crc kubenswrapper[4816]: E0316 00:12:36.571020 4816 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.158:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189d29f0e7c5f931 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:12:22.911244593 +0000 UTC m=+336.007544546,LastTimestamp:2026-03-16 00:12:22.911244593 +0000 UTC m=+336.007544546,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:12:36 crc kubenswrapper[4816]: I0316 00:12:36.624123 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 16 00:12:36 crc kubenswrapper[4816]: 
I0316 00:12:36.624673 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 16 00:12:36 crc kubenswrapper[4816]: I0316 00:12:36.624720 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"4aa569a30f818c0b18a06057a8cfa80679949921cd47150e4a19e7cda2ca413d"} Mar 16 00:12:36 crc kubenswrapper[4816]: I0316 00:12:36.626203 4816 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 16 00:12:36 crc kubenswrapper[4816]: I0316 00:12:36.627241 4816 status_manager.go:851] "Failed to get status for pod" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-source-55646444c4-trplf\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 16 00:12:36 crc kubenswrapper[4816]: I0316 00:12:36.627689 4816 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 16 00:12:36 crc kubenswrapper[4816]: I0316 00:12:36.628167 4816 status_manager.go:851] "Failed to get status for pod" podUID="cc1ea93d-1cf8-4145-ad35-83f2d1357f9d" 
pod="openshift-marketplace/redhat-operators-hvpqn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-hvpqn\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 16 00:12:36 crc kubenswrapper[4816]: I0316 00:12:36.628415 4816 status_manager.go:851] "Failed to get status for pod" podUID="a9416716-e666-46d6-9d77-fe5c9702c035" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 16 00:12:36 crc kubenswrapper[4816]: I0316 00:12:36.666958 4816 scope.go:117] "RemoveContainer" containerID="486dacbeaae2de35a560c3abe670d11c1b0aff52bf6fcfee4e790d8493ac9b11" Mar 16 00:12:36 crc kubenswrapper[4816]: I0316 00:12:36.667426 4816 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 16 00:12:36 crc kubenswrapper[4816]: I0316 00:12:36.667886 4816 status_manager.go:851] "Failed to get status for pod" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-source-55646444c4-trplf\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 16 00:12:36 crc kubenswrapper[4816]: I0316 00:12:36.668323 4816 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 16 00:12:36 crc kubenswrapper[4816]: I0316 00:12:36.668587 4816 status_manager.go:851] "Failed to get status for pod" podUID="cc1ea93d-1cf8-4145-ad35-83f2d1357f9d" pod="openshift-marketplace/redhat-operators-hvpqn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-hvpqn\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 16 00:12:36 crc kubenswrapper[4816]: I0316 00:12:36.668847 4816 status_manager.go:851] "Failed to get status for pod" podUID="a9416716-e666-46d6-9d77-fe5c9702c035" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 16 00:12:37 crc kubenswrapper[4816]: I0316 00:12:37.634509 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/2.log" Mar 16 00:12:37 crc kubenswrapper[4816]: I0316 00:12:37.635662 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/1.log" Mar 16 00:12:37 crc kubenswrapper[4816]: I0316 00:12:37.635715 4816 generic.go:334] "Generic (PLEG): container finished" podID="9d751cbb-f2e2-430d-9754-c882a5e924a5" containerID="bf90d2b66ba96bda100b2d7e60bddd86b4409c573e87857f65bb7785355097cf" exitCode=255 Mar 16 00:12:37 crc kubenswrapper[4816]: I0316 00:12:37.635748 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" 
event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerDied","Data":"bf90d2b66ba96bda100b2d7e60bddd86b4409c573e87857f65bb7785355097cf"} Mar 16 00:12:37 crc kubenswrapper[4816]: I0316 00:12:37.635788 4816 scope.go:117] "RemoveContainer" containerID="486dacbeaae2de35a560c3abe670d11c1b0aff52bf6fcfee4e790d8493ac9b11" Mar 16 00:12:37 crc kubenswrapper[4816]: I0316 00:12:37.636267 4816 scope.go:117] "RemoveContainer" containerID="bf90d2b66ba96bda100b2d7e60bddd86b4409c573e87857f65bb7785355097cf" Mar 16 00:12:37 crc kubenswrapper[4816]: E0316 00:12:37.636521 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=check-endpoints pod=network-check-source-55646444c4-trplf_openshift-network-diagnostics(9d751cbb-f2e2-430d-9754-c882a5e924a5)\"" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:12:37 crc kubenswrapper[4816]: I0316 00:12:37.639425 4816 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 16 00:12:37 crc kubenswrapper[4816]: I0316 00:12:37.639871 4816 status_manager.go:851] "Failed to get status for pod" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-source-55646444c4-trplf\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 16 00:12:37 crc kubenswrapper[4816]: I0316 00:12:37.640403 4816 status_manager.go:851] "Failed to get status for pod" 
podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 16 00:12:37 crc kubenswrapper[4816]: I0316 00:12:37.640758 4816 status_manager.go:851] "Failed to get status for pod" podUID="cc1ea93d-1cf8-4145-ad35-83f2d1357f9d" pod="openshift-marketplace/redhat-operators-hvpqn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-hvpqn\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 16 00:12:37 crc kubenswrapper[4816]: I0316 00:12:37.640999 4816 status_manager.go:851] "Failed to get status for pod" podUID="a9416716-e666-46d6-9d77-fe5c9702c035" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 16 00:12:37 crc kubenswrapper[4816]: I0316 00:12:37.667762 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 16 00:12:37 crc kubenswrapper[4816]: I0316 00:12:37.675069 4816 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 16 00:12:37 crc kubenswrapper[4816]: I0316 00:12:37.677916 4816 status_manager.go:851] "Failed to get status for pod" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-source-55646444c4-trplf\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 16 00:12:37 crc kubenswrapper[4816]: I0316 00:12:37.678395 4816 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 16 00:12:37 crc kubenswrapper[4816]: I0316 00:12:37.679021 4816 status_manager.go:851] "Failed to get status for pod" podUID="cc1ea93d-1cf8-4145-ad35-83f2d1357f9d" pod="openshift-marketplace/redhat-operators-hvpqn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-hvpqn\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 16 00:12:37 crc kubenswrapper[4816]: I0316 00:12:37.679428 4816 status_manager.go:851] "Failed to get status for pod" podUID="a9416716-e666-46d6-9d77-fe5c9702c035" pod="openshift-kube-apiserver/installer-9-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 16 00:12:37 crc kubenswrapper[4816]: I0316 00:12:37.679858 4816 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 16 00:12:37 crc kubenswrapper[4816]: I0316 00:12:37.680229 4816 status_manager.go:851] "Failed to get status for pod" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-source-55646444c4-trplf\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 16 00:12:37 crc kubenswrapper[4816]: I0316 00:12:37.680676 4816 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 16 00:12:37 crc kubenswrapper[4816]: I0316 00:12:37.681108 4816 status_manager.go:851] "Failed to get status for pod" podUID="cc1ea93d-1cf8-4145-ad35-83f2d1357f9d" pod="openshift-marketplace/redhat-operators-hvpqn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-hvpqn\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 16 00:12:37 crc kubenswrapper[4816]: I0316 00:12:37.681470 4816 status_manager.go:851] "Failed to get status for pod" podUID="a9416716-e666-46d6-9d77-fe5c9702c035" 
pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 16 00:12:37 crc kubenswrapper[4816]: I0316 00:12:37.717142 4816 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="8c8786d1-025d-4788-bafe-c2c2eaf8e398" Mar 16 00:12:37 crc kubenswrapper[4816]: I0316 00:12:37.717172 4816 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="8c8786d1-025d-4788-bafe-c2c2eaf8e398" Mar 16 00:12:37 crc kubenswrapper[4816]: E0316 00:12:37.717481 4816 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 16 00:12:37 crc kubenswrapper[4816]: I0316 00:12:37.717899 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 16 00:12:37 crc kubenswrapper[4816]: W0316 00:12:37.747591 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-6672936e2d6cfd90abf352ef7108e1954aed5c372ffd62b1a57c84a375772f32 WatchSource:0}: Error finding container 6672936e2d6cfd90abf352ef7108e1954aed5c372ffd62b1a57c84a375772f32: Status 404 returned error can't find the container with id 6672936e2d6cfd90abf352ef7108e1954aed5c372ffd62b1a57c84a375772f32 Mar 16 00:12:38 crc kubenswrapper[4816]: I0316 00:12:38.645050 4816 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="a2123cc741523f2b46f75dea80db3f916696a6a9cdfe0d6979eab60ea890fc6e" exitCode=0 Mar 16 00:12:38 crc kubenswrapper[4816]: I0316 00:12:38.645167 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"a2123cc741523f2b46f75dea80db3f916696a6a9cdfe0d6979eab60ea890fc6e"} Mar 16 00:12:38 crc kubenswrapper[4816]: I0316 00:12:38.645603 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"6672936e2d6cfd90abf352ef7108e1954aed5c372ffd62b1a57c84a375772f32"} Mar 16 00:12:38 crc kubenswrapper[4816]: I0316 00:12:38.646161 4816 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="8c8786d1-025d-4788-bafe-c2c2eaf8e398" Mar 16 00:12:38 crc kubenswrapper[4816]: I0316 00:12:38.646198 4816 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="8c8786d1-025d-4788-bafe-c2c2eaf8e398" Mar 16 00:12:38 crc kubenswrapper[4816]: I0316 00:12:38.646775 4816 status_manager.go:851] 
"Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 16 00:12:38 crc kubenswrapper[4816]: E0316 00:12:38.646917 4816 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 16 00:12:38 crc kubenswrapper[4816]: I0316 00:12:38.647233 4816 status_manager.go:851] "Failed to get status for pod" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-source-55646444c4-trplf\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 16 00:12:38 crc kubenswrapper[4816]: I0316 00:12:38.647730 4816 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 16 00:12:38 crc kubenswrapper[4816]: I0316 00:12:38.648167 4816 status_manager.go:851] "Failed to get status for pod" podUID="cc1ea93d-1cf8-4145-ad35-83f2d1357f9d" pod="openshift-marketplace/redhat-operators-hvpqn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-hvpqn\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 16 00:12:38 crc kubenswrapper[4816]: I0316 00:12:38.648678 4816 
status_manager.go:851] "Failed to get status for pod" podUID="a9416716-e666-46d6-9d77-fe5c9702c035" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 16 00:12:38 crc kubenswrapper[4816]: I0316 00:12:38.649356 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/2.log" Mar 16 00:12:38 crc kubenswrapper[4816]: I0316 00:12:38.727753 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 16 00:12:39 crc kubenswrapper[4816]: I0316 00:12:39.674226 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"40545204d92772841c657d6eeb43c327aa4555b2f6a8d8d3c8bc781fc6d46131"} Mar 16 00:12:39 crc kubenswrapper[4816]: I0316 00:12:39.674494 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"85b4a9ece3a249bf5d9cd638d77f53ce85d4536ea7bc057b1d9c968457155e86"} Mar 16 00:12:39 crc kubenswrapper[4816]: I0316 00:12:39.674506 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"a328b262655f75aad03f269c24749d7fbbf17a17c55477a659d8bdd35a3a18de"} Mar 16 00:12:39 crc kubenswrapper[4816]: I0316 00:12:39.674516 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"fae10584beeafc12e6ee60bc51600e75d592acdaf0bfcb1d6909e9e887d33af8"} Mar 16 00:12:40 crc kubenswrapper[4816]: I0316 00:12:40.680566 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"96fece39acca2cfb04562648fd9193a7dd8eadc42c788da332138e3a4ca4f8cc"} Mar 16 00:12:40 crc kubenswrapper[4816]: I0316 00:12:40.680896 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 16 00:12:40 crc kubenswrapper[4816]: I0316 00:12:40.681009 4816 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="8c8786d1-025d-4788-bafe-c2c2eaf8e398" Mar 16 00:12:40 crc kubenswrapper[4816]: I0316 00:12:40.681049 4816 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="8c8786d1-025d-4788-bafe-c2c2eaf8e398" Mar 16 00:12:42 crc kubenswrapper[4816]: I0316 00:12:42.583070 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 16 00:12:42 crc kubenswrapper[4816]: I0316 00:12:42.583273 4816 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Mar 16 00:12:42 crc kubenswrapper[4816]: I0316 00:12:42.583392 4816 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 
192.168.126.11:10257: connect: connection refused" Mar 16 00:12:42 crc kubenswrapper[4816]: I0316 00:12:42.717995 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 16 00:12:42 crc kubenswrapper[4816]: I0316 00:12:42.718057 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 16 00:12:42 crc kubenswrapper[4816]: I0316 00:12:42.723504 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 16 00:12:45 crc kubenswrapper[4816]: I0316 00:12:45.698124 4816 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 16 00:12:46 crc kubenswrapper[4816]: I0316 00:12:46.717735 4816 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="8c8786d1-025d-4788-bafe-c2c2eaf8e398" Mar 16 00:12:46 crc kubenswrapper[4816]: I0316 00:12:46.718137 4816 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="8c8786d1-025d-4788-bafe-c2c2eaf8e398" Mar 16 00:12:46 crc kubenswrapper[4816]: I0316 00:12:46.726771 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 16 00:12:47 crc kubenswrapper[4816]: I0316 00:12:47.701380 4816 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="99986f4e-b436-455c-9692-380f895c4832" Mar 16 00:12:47 crc kubenswrapper[4816]: I0316 00:12:47.722594 4816 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="8c8786d1-025d-4788-bafe-c2c2eaf8e398" Mar 16 00:12:47 crc kubenswrapper[4816]: I0316 00:12:47.722625 4816 mirror_client.go:130] "Deleting a mirror pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="8c8786d1-025d-4788-bafe-c2c2eaf8e398" Mar 16 00:12:47 crc kubenswrapper[4816]: I0316 00:12:47.725523 4816 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="99986f4e-b436-455c-9692-380f895c4832" Mar 16 00:12:51 crc kubenswrapper[4816]: I0316 00:12:51.668338 4816 scope.go:117] "RemoveContainer" containerID="bf90d2b66ba96bda100b2d7e60bddd86b4409c573e87857f65bb7785355097cf" Mar 16 00:12:51 crc kubenswrapper[4816]: E0316 00:12:51.669295 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=check-endpoints pod=network-check-source-55646444c4-trplf_openshift-network-diagnostics(9d751cbb-f2e2-430d-9754-c882a5e924a5)\"" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:12:52 crc kubenswrapper[4816]: I0316 00:12:52.587677 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 16 00:12:52 crc kubenswrapper[4816]: I0316 00:12:52.592389 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 16 00:12:54 crc kubenswrapper[4816]: I0316 00:12:54.875399 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 16 00:12:55 crc kubenswrapper[4816]: I0316 00:12:55.144365 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 16 00:12:55 crc kubenswrapper[4816]: I0316 00:12:55.229500 4816 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-service-ca-operator"/"serving-cert" Mar 16 00:12:55 crc kubenswrapper[4816]: I0316 00:12:55.698590 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 16 00:12:55 crc kubenswrapper[4816]: I0316 00:12:55.850469 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 16 00:12:56 crc kubenswrapper[4816]: I0316 00:12:56.244838 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 16 00:12:56 crc kubenswrapper[4816]: I0316 00:12:56.412797 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 16 00:12:56 crc kubenswrapper[4816]: I0316 00:12:56.650228 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 16 00:12:56 crc kubenswrapper[4816]: I0316 00:12:56.769098 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 16 00:12:57 crc kubenswrapper[4816]: I0316 00:12:57.041016 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 16 00:12:57 crc kubenswrapper[4816]: I0316 00:12:57.107815 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 16 00:12:57 crc kubenswrapper[4816]: I0316 00:12:57.112976 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 16 00:12:57 crc kubenswrapper[4816]: I0316 00:12:57.322264 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 16 00:12:57 crc kubenswrapper[4816]: 
I0316 00:12:57.620512 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 16 00:12:57 crc kubenswrapper[4816]: I0316 00:12:57.668717 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 16 00:12:57 crc kubenswrapper[4816]: I0316 00:12:57.800851 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 16 00:12:58 crc kubenswrapper[4816]: I0316 00:12:58.002282 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 16 00:12:58 crc kubenswrapper[4816]: I0316 00:12:58.049798 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 16 00:12:58 crc kubenswrapper[4816]: I0316 00:12:58.058402 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 16 00:12:58 crc kubenswrapper[4816]: I0316 00:12:58.209265 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 16 00:12:58 crc kubenswrapper[4816]: I0316 00:12:58.389360 4816 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 16 00:12:58 crc kubenswrapper[4816]: I0316 00:12:58.546452 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 16 00:12:58 crc kubenswrapper[4816]: I0316 00:12:58.627097 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 16 00:12:58 crc kubenswrapper[4816]: I0316 00:12:58.673690 4816 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 16 00:12:58 crc kubenswrapper[4816]: I0316 00:12:58.680877 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 16 00:12:58 crc kubenswrapper[4816]: I0316 00:12:58.720381 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 16 00:12:58 crc kubenswrapper[4816]: I0316 00:12:58.753568 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 16 00:12:58 crc kubenswrapper[4816]: I0316 00:12:58.789520 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 16 00:12:58 crc kubenswrapper[4816]: I0316 00:12:58.910029 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 16 00:12:58 crc kubenswrapper[4816]: I0316 00:12:58.949402 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 16 00:12:58 crc kubenswrapper[4816]: I0316 00:12:58.998266 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:12:59 crc kubenswrapper[4816]: I0316 00:12:59.075093 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 16 00:12:59 crc kubenswrapper[4816]: I0316 00:12:59.152434 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 16 00:12:59 crc kubenswrapper[4816]: I0316 00:12:59.162427 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" 
Mar 16 00:12:59 crc kubenswrapper[4816]: I0316 00:12:59.166654 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Mar 16 00:12:59 crc kubenswrapper[4816]: I0316 00:12:59.230756 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Mar 16 00:12:59 crc kubenswrapper[4816]: I0316 00:12:59.233628 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Mar 16 00:12:59 crc kubenswrapper[4816]: I0316 00:12:59.235899 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Mar 16 00:12:59 crc kubenswrapper[4816]: I0316 00:12:59.285787 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Mar 16 00:12:59 crc kubenswrapper[4816]: I0316 00:12:59.334950 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Mar 16 00:12:59 crc kubenswrapper[4816]: I0316 00:12:59.336934 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Mar 16 00:12:59 crc kubenswrapper[4816]: I0316 00:12:59.353699 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Mar 16 00:12:59 crc kubenswrapper[4816]: I0316 00:12:59.365790 4816 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Mar 16 00:12:59 crc kubenswrapper[4816]: I0316 00:12:59.370982 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Mar 16 00:12:59 crc kubenswrapper[4816]: I0316 00:12:59.420613 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 16 00:12:59 crc kubenswrapper[4816]: I0316 00:12:59.425010 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Mar 16 00:12:59 crc kubenswrapper[4816]: I0316 00:12:59.485422 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Mar 16 00:12:59 crc kubenswrapper[4816]: I0316 00:12:59.641114 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk"
Mar 16 00:12:59 crc kubenswrapper[4816]: I0316 00:12:59.684106 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Mar 16 00:12:59 crc kubenswrapper[4816]: I0316 00:12:59.729678 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Mar 16 00:12:59 crc kubenswrapper[4816]: I0316 00:12:59.749987 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6"
Mar 16 00:12:59 crc kubenswrapper[4816]: I0316 00:12:59.836892 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Mar 16 00:12:59 crc kubenswrapper[4816]: I0316 00:12:59.867335 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5"
Mar 16 00:12:59 crc kubenswrapper[4816]: I0316 00:12:59.953258 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt"
Mar 16 00:12:59 crc kubenswrapper[4816]: I0316 00:12:59.988606 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86"
Mar 16 00:13:00 crc kubenswrapper[4816]: I0316 00:13:00.107559 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Mar 16 00:13:00 crc kubenswrapper[4816]: I0316 00:13:00.126980 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt"
Mar 16 00:13:00 crc kubenswrapper[4816]: I0316 00:13:00.163535 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Mar 16 00:13:00 crc kubenswrapper[4816]: I0316 00:13:00.202358 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 16 00:13:00 crc kubenswrapper[4816]: I0316 00:13:00.341944 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Mar 16 00:13:00 crc kubenswrapper[4816]: I0316 00:13:00.403428 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt"
Mar 16 00:13:00 crc kubenswrapper[4816]: I0316 00:13:00.409652 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Mar 16 00:13:00 crc kubenswrapper[4816]: I0316 00:13:00.428632 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Mar 16 00:13:00 crc kubenswrapper[4816]: I0316 00:13:00.431658 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw"
Mar 16 00:13:00 crc kubenswrapper[4816]: I0316 00:13:00.431885 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Mar 16 00:13:00 crc kubenswrapper[4816]: I0316 00:13:00.460816 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Mar 16 00:13:00 crc kubenswrapper[4816]: I0316 00:13:00.489196 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Mar 16 00:13:00 crc kubenswrapper[4816]: I0316 00:13:00.502769 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Mar 16 00:13:00 crc kubenswrapper[4816]: I0316 00:13:00.531199 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Mar 16 00:13:00 crc kubenswrapper[4816]: I0316 00:13:00.537573 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Mar 16 00:13:00 crc kubenswrapper[4816]: I0316 00:13:00.591502 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Mar 16 00:13:00 crc kubenswrapper[4816]: I0316 00:13:00.608688 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Mar 16 00:13:00 crc kubenswrapper[4816]: I0316 00:13:00.783319 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Mar 16 00:13:00 crc kubenswrapper[4816]: I0316 00:13:00.835129 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4"
Mar 16 00:13:00 crc kubenswrapper[4816]: I0316 00:13:00.903492 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Mar 16 00:13:00 crc kubenswrapper[4816]: I0316 00:13:00.909339 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Mar 16 00:13:00 crc kubenswrapper[4816]: I0316 00:13:00.987613 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Mar 16 00:13:00 crc kubenswrapper[4816]: I0316 00:13:00.998829 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates"
Mar 16 00:13:01 crc kubenswrapper[4816]: I0316 00:13:01.076682 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Mar 16 00:13:01 crc kubenswrapper[4816]: I0316 00:13:01.107998 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Mar 16 00:13:01 crc kubenswrapper[4816]: I0316 00:13:01.159184 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Mar 16 00:13:01 crc kubenswrapper[4816]: I0316 00:13:01.183400 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Mar 16 00:13:01 crc kubenswrapper[4816]: I0316 00:13:01.331719 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert"
Mar 16 00:13:01 crc kubenswrapper[4816]: I0316 00:13:01.359909 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Mar 16 00:13:01 crc kubenswrapper[4816]: I0316 00:13:01.430561 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Mar 16 00:13:01 crc kubenswrapper[4816]: I0316 00:13:01.474699 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Mar 16 00:13:01 crc kubenswrapper[4816]: I0316 00:13:01.487698 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Mar 16 00:13:01 crc kubenswrapper[4816]: I0316 00:13:01.506711 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt"
Mar 16 00:13:01 crc kubenswrapper[4816]: I0316 00:13:01.521317 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt"
Mar 16 00:13:01 crc kubenswrapper[4816]: I0316 00:13:01.535028 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Mar 16 00:13:01 crc kubenswrapper[4816]: I0316 00:13:01.578993 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Mar 16 00:13:01 crc kubenswrapper[4816]: I0316 00:13:01.607265 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Mar 16 00:13:01 crc kubenswrapper[4816]: I0316 00:13:01.608827 4816 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k"
Mar 16 00:13:01 crc kubenswrapper[4816]: I0316 00:13:01.740841 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Mar 16 00:13:01 crc kubenswrapper[4816]: I0316 00:13:01.967745 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Mar 16 00:13:02 crc kubenswrapper[4816]: I0316 00:13:02.015530 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Mar 16 00:13:02 crc kubenswrapper[4816]: I0316 00:13:02.123954 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt"
Mar 16 00:13:02 crc kubenswrapper[4816]: I0316 00:13:02.149431 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq"
Mar 16 00:13:02 crc kubenswrapper[4816]: I0316 00:13:02.165201 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Mar 16 00:13:02 crc kubenswrapper[4816]: I0316 00:13:02.344955 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Mar 16 00:13:02 crc kubenswrapper[4816]: I0316 00:13:02.345172 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Mar 16 00:13:02 crc kubenswrapper[4816]: I0316 00:13:02.365856 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt"
Mar 16 00:13:02 crc kubenswrapper[4816]: I0316 00:13:02.407295 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Mar 16 00:13:02 crc kubenswrapper[4816]: I0316 00:13:02.409473 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Mar 16 00:13:02 crc kubenswrapper[4816]: I0316 00:13:02.502808 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Mar 16 00:13:02 crc kubenswrapper[4816]: I0316 00:13:02.540702 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Mar 16 00:13:02 crc kubenswrapper[4816]: I0316 00:13:02.572172 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt"
Mar 16 00:13:02 crc kubenswrapper[4816]: I0316 00:13:02.597166 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Mar 16 00:13:02 crc kubenswrapper[4816]: I0316 00:13:02.598058 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert"
Mar 16 00:13:02 crc kubenswrapper[4816]: I0316 00:13:02.616647 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Mar 16 00:13:02 crc kubenswrapper[4816]: I0316 00:13:02.629393 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Mar 16 00:13:02 crc kubenswrapper[4816]: I0316 00:13:02.670688 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx"
Mar 16 00:13:02 crc kubenswrapper[4816]: I0316 00:13:02.698246 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Mar 16 00:13:02 crc kubenswrapper[4816]: I0316 00:13:02.726015 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config"
Mar 16 00:13:02 crc kubenswrapper[4816]: I0316 00:13:02.741859 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt"
Mar 16 00:13:02 crc kubenswrapper[4816]: I0316 00:13:02.826866 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config"
Mar 16 00:13:02 crc kubenswrapper[4816]: I0316 00:13:02.921133 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv"
Mar 16 00:13:02 crc kubenswrapper[4816]: I0316 00:13:02.938642 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 16 00:13:03 crc kubenswrapper[4816]: I0316 00:13:03.076526 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Mar 16 00:13:03 crc kubenswrapper[4816]: I0316 00:13:03.081173 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Mar 16 00:13:03 crc kubenswrapper[4816]: I0316 00:13:03.100167 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Mar 16 00:13:03 crc kubenswrapper[4816]: I0316 00:13:03.129021 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Mar 16 00:13:03 crc kubenswrapper[4816]: I0316 00:13:03.132148 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Mar 16 00:13:03 crc kubenswrapper[4816]: I0316 00:13:03.159790 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Mar 16 00:13:03 crc kubenswrapper[4816]: I0316 00:13:03.309848 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Mar 16 00:13:03 crc kubenswrapper[4816]: I0316 00:13:03.310926 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Mar 16 00:13:03 crc kubenswrapper[4816]: I0316 00:13:03.482314 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Mar 16 00:13:03 crc kubenswrapper[4816]: I0316 00:13:03.527499 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt"
Mar 16 00:13:03 crc kubenswrapper[4816]: I0316 00:13:03.530916 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq"
Mar 16 00:13:03 crc kubenswrapper[4816]: I0316 00:13:03.666431 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx"
Mar 16 00:13:03 crc kubenswrapper[4816]: I0316 00:13:03.667230 4816 scope.go:117] "RemoveContainer" containerID="bf90d2b66ba96bda100b2d7e60bddd86b4409c573e87857f65bb7785355097cf"
Mar 16 00:13:03 crc kubenswrapper[4816]: I0316 00:13:03.678584 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Mar 16 00:13:03 crc kubenswrapper[4816]: I0316 00:13:03.699293 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 16 00:13:03 crc kubenswrapper[4816]: I0316 00:13:03.740535 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle"
Mar 16 00:13:03 crc kubenswrapper[4816]: I0316 00:13:03.754462 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd"
Mar 16 00:13:03 crc kubenswrapper[4816]: I0316 00:13:03.778638 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 16 00:13:03 crc kubenswrapper[4816]: I0316 00:13:03.819680 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/2.log"
Mar 16 00:13:03 crc kubenswrapper[4816]: I0316 00:13:03.819749 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"9ac66afe7f0f3082bc91e3215ea8df4639682a9c3ef140496b94009ae3f373e1"}
Mar 16 00:13:03 crc kubenswrapper[4816]: I0316 00:13:03.885353 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Mar 16 00:13:04 crc kubenswrapper[4816]: I0316 00:13:04.007701 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Mar 16 00:13:04 crc kubenswrapper[4816]: I0316 00:13:04.054667 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl"
Mar 16 00:13:04 crc kubenswrapper[4816]: I0316 00:13:04.107461 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Mar 16 00:13:04 crc kubenswrapper[4816]: I0316 00:13:04.162038 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt"
Mar 16 00:13:04 crc kubenswrapper[4816]: I0316 00:13:04.252449 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf"
Mar 16 00:13:04 crc kubenswrapper[4816]: I0316 00:13:04.552481 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Mar 16 00:13:04 crc kubenswrapper[4816]: I0316 00:13:04.773528 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist"
Mar 16 00:13:04 crc kubenswrapper[4816]: I0316 00:13:04.781908 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Mar 16 00:13:04 crc kubenswrapper[4816]: I0316 00:13:04.848827 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt"
Mar 16 00:13:04 crc kubenswrapper[4816]: I0316 00:13:04.879262 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Mar 16 00:13:04 crc kubenswrapper[4816]: I0316 00:13:04.970216 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz"
Mar 16 00:13:05 crc kubenswrapper[4816]: I0316 00:13:05.034028 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Mar 16 00:13:05 crc kubenswrapper[4816]: I0316 00:13:05.050708 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Mar 16 00:13:05 crc kubenswrapper[4816]: I0316 00:13:05.057489 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Mar 16 00:13:05 crc kubenswrapper[4816]: I0316 00:13:05.077491 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Mar 16 00:13:05 crc kubenswrapper[4816]: I0316 00:13:05.145819 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Mar 16 00:13:05 crc kubenswrapper[4816]: I0316 00:13:05.261340 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Mar 16 00:13:05 crc kubenswrapper[4816]: I0316 00:13:05.314988 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p"
Mar 16 00:13:05 crc kubenswrapper[4816]: I0316 00:13:05.372542 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Mar 16 00:13:05 crc kubenswrapper[4816]: I0316 00:13:05.510438 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Mar 16 00:13:05 crc kubenswrapper[4816]: I0316 00:13:05.545375 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Mar 16 00:13:05 crc kubenswrapper[4816]: I0316 00:13:05.572030 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Mar 16 00:13:05 crc kubenswrapper[4816]: I0316 00:13:05.580775 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Mar 16 00:13:05 crc kubenswrapper[4816]: I0316 00:13:05.630995 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw"
Mar 16 00:13:05 crc kubenswrapper[4816]: I0316 00:13:05.642934 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Mar 16 00:13:05 crc kubenswrapper[4816]: I0316 00:13:05.697208 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Mar 16 00:13:05 crc kubenswrapper[4816]: I0316 00:13:05.702747 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Mar 16 00:13:05 crc kubenswrapper[4816]: I0316 00:13:05.749388 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Mar 16 00:13:05 crc kubenswrapper[4816]: I0316 00:13:05.784004 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Mar 16 00:13:05 crc kubenswrapper[4816]: I0316 00:13:05.820245 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Mar 16 00:13:05 crc kubenswrapper[4816]: I0316 00:13:05.843342 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Mar 16 00:13:05 crc kubenswrapper[4816]: I0316 00:13:05.885365 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Mar 16 00:13:05 crc kubenswrapper[4816]: I0316 00:13:05.930832 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
Mar 16 00:13:06 crc kubenswrapper[4816]: I0316 00:13:06.073437 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg"
Mar 16 00:13:06 crc kubenswrapper[4816]: I0316 00:13:06.135816 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Mar 16 00:13:06 crc kubenswrapper[4816]: I0316 00:13:06.141461 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7"
Mar 16 00:13:06 crc kubenswrapper[4816]: I0316 00:13:06.192906 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Mar 16 00:13:06 crc kubenswrapper[4816]: I0316 00:13:06.223912 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Mar 16 00:13:06 crc kubenswrapper[4816]: I0316 00:13:06.224483 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 16 00:13:06 crc kubenswrapper[4816]: I0316 00:13:06.377800 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca"
Mar 16 00:13:06 crc kubenswrapper[4816]: I0316 00:13:06.437920 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt"
Mar 16 00:13:06 crc kubenswrapper[4816]: I0316 00:13:06.450050 4816 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Mar 16 00:13:06 crc kubenswrapper[4816]: I0316 00:13:06.560662 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy"
Mar 16 00:13:06 crc kubenswrapper[4816]: I0316 00:13:06.569649 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj"
Mar 16 00:13:06 crc kubenswrapper[4816]: I0316 00:13:06.652512 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt"
Mar 16 00:13:06 crc kubenswrapper[4816]: I0316 00:13:06.666138 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Mar 16 00:13:06 crc kubenswrapper[4816]: I0316 00:13:06.705954 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Mar 16 00:13:06 crc kubenswrapper[4816]: I0316 00:13:06.759214 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Mar 16 00:13:06 crc kubenswrapper[4816]: I0316 00:13:06.808868 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Mar 16 00:13:06 crc kubenswrapper[4816]: I0316 00:13:06.938119 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Mar 16 00:13:06 crc kubenswrapper[4816]: I0316 00:13:06.979771 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Mar 16 00:13:06 crc kubenswrapper[4816]: I0316 00:13:06.980665 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Mar 16 00:13:06 crc kubenswrapper[4816]: I0316 00:13:06.997936 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 16 00:13:06 crc kubenswrapper[4816]: I0316 00:13:06.998679 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Mar 16 00:13:07 crc kubenswrapper[4816]: I0316 00:13:07.072147 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Mar 16 00:13:07 crc kubenswrapper[4816]: I0316 00:13:07.212893 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert"
Mar 16 00:13:07 crc kubenswrapper[4816]: I0316 00:13:07.324405 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Mar 16 00:13:07 crc kubenswrapper[4816]: I0316 00:13:07.477402 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Mar 16 00:13:07 crc kubenswrapper[4816]: I0316 00:13:07.499724 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert"
Mar 16 00:13:07 crc kubenswrapper[4816]: I0316 00:13:07.542519 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Mar 16 00:13:07 crc kubenswrapper[4816]: I0316 00:13:07.575429 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Mar 16 00:13:07 crc kubenswrapper[4816]: I0316 00:13:07.639992 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Mar 16 00:13:07 crc kubenswrapper[4816]: I0316 00:13:07.655251 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx"
Mar 16 00:13:07 crc kubenswrapper[4816]: I0316 00:13:07.660113 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Mar 16 00:13:07 crc kubenswrapper[4816]: I0316 00:13:07.705655 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 16 00:13:07 crc kubenswrapper[4816]: I0316 00:13:07.762756 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides"
Mar 16 00:13:07 crc kubenswrapper[4816]: I0316 00:13:07.797639 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Mar 16 00:13:07 crc kubenswrapper[4816]: I0316 00:13:07.837704 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca"
Mar 16 00:13:07 crc kubenswrapper[4816]: I0316 00:13:07.854005 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Mar 16 00:13:08 crc kubenswrapper[4816]: I0316 00:13:08.040946 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Mar 16 00:13:08 crc kubenswrapper[4816]: I0316 00:13:08.086944 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Mar 16 00:13:08 crc kubenswrapper[4816]: I0316 00:13:08.173245 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Mar 16 00:13:08 crc kubenswrapper[4816]: I0316 00:13:08.448790 4816 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Mar 16 00:13:08 crc kubenswrapper[4816]: I0316 00:13:08.522001 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w"
Mar 16 00:13:08 crc kubenswrapper[4816]: I0316 00:13:08.522359 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Mar 16 00:13:08 crc kubenswrapper[4816]: I0316 00:13:08.638514 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw"
Mar 16 00:13:08 crc kubenswrapper[4816]: I0316 00:13:08.638708 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Mar 16 00:13:08 crc kubenswrapper[4816]: I0316 00:13:08.664252 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets"
Mar 16 00:13:08 crc kubenswrapper[4816]: I0316 00:13:08.754947 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Mar 16 00:13:08 crc kubenswrapper[4816]: I0316 00:13:08.920830 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Mar 16 00:13:09 crc kubenswrapper[4816]: I0316 00:13:09.007419 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Mar 16 00:13:09 crc kubenswrapper[4816]: I0316 00:13:09.011839 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Mar 16 00:13:09 crc kubenswrapper[4816]: I0316 00:13:09.081697 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Mar 16 00:13:09 crc kubenswrapper[4816]: I0316 00:13:09.091143 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Mar 16 00:13:09 crc kubenswrapper[4816]: I0316 00:13:09.121558 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Mar 16 00:13:09 crc kubenswrapper[4816]: I0316 00:13:09.272910 4816 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Mar 16 00:13:09 crc kubenswrapper[4816]: I0316 00:13:09.276669 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podStartSLOduration=47.276653714 podStartE2EDuration="47.276653714s" podCreationTimestamp="2026-03-16 00:12:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:12:45.778377683 +0000 UTC m=+358.874677656" watchObservedRunningTime="2026-03-16 00:13:09.276653714 +0000 UTC m=+382.372953667"
Mar 16 00:13:09 crc kubenswrapper[4816]: I0316 00:13:09.277138 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Mar 16 00:13:09 crc kubenswrapper[4816]: I0316 00:13:09.277171 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Mar 16 00:13:09 crc kubenswrapper[4816]: I0316 00:13:09.283756 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 16 00:13:09 crc kubenswrapper[4816]: I0316 00:13:09.284625 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls"
Mar 16 00:13:09 crc kubenswrapper[4816]: I0316 00:13:09.302287 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=24.302273077 podStartE2EDuration="24.302273077s" podCreationTimestamp="2026-03-16 00:12:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:13:09.301032149 +0000 UTC m=+382.397332102" watchObservedRunningTime="2026-03-16 00:13:09.302273077 +0000 UTC m=+382.398573030"
Mar 16 00:13:09 crc kubenswrapper[4816]: I0316 00:13:09.372777 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg"
Mar 16 00:13:09 crc kubenswrapper[4816]: I0316 00:13:09.395795 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Mar 16 00:13:09 crc kubenswrapper[4816]: I0316 00:13:09.505738 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Mar 16 00:13:09 crc kubenswrapper[4816]: I0316 00:13:09.541266 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Mar 16 00:13:09 crc kubenswrapper[4816]: I0316 00:13:09.738280 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Mar 16 00:13:09 crc kubenswrapper[4816]: I0316 00:13:09.738794 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh"
Mar 16 00:13:09 crc kubenswrapper[4816]: I0316 00:13:09.861349 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Mar 16 00:13:09 crc kubenswrapper[4816]: I0316 00:13:09.980453 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Mar 16 00:13:10 crc kubenswrapper[4816]: I0316 00:13:10.025583 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Mar 16 00:13:10 crc kubenswrapper[4816]: I0316 00:13:10.294457 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd"
Mar 16 00:13:10 crc kubenswrapper[4816]: I0316 00:13:10.345104 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 16 00:13:10 crc kubenswrapper[4816]: I0316 00:13:10.701584 4816 reflector.go:368] Caches
populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 16 00:13:10 crc kubenswrapper[4816]: I0316 00:13:10.815657 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 16 00:13:10 crc kubenswrapper[4816]: I0316 00:13:10.936990 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 16 00:13:11 crc kubenswrapper[4816]: I0316 00:13:11.035374 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 16 00:13:11 crc kubenswrapper[4816]: I0316 00:13:11.047008 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 16 00:13:11 crc kubenswrapper[4816]: I0316 00:13:11.189059 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 16 00:13:11 crc kubenswrapper[4816]: I0316 00:13:11.451743 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 16 00:13:19 crc kubenswrapper[4816]: I0316 00:13:19.502025 4816 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 16 00:13:19 crc kubenswrapper[4816]: I0316 00:13:19.503038 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://eed2d54a8999c489e0b768ad5f7c5739ae6d65acf1bf2efc0c5987b2cb49fba4" gracePeriod=5 Mar 16 00:13:21 crc kubenswrapper[4816]: I0316 00:13:21.887272 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7fd798dd88-7v9zs"] Mar 16 00:13:21 
crc kubenswrapper[4816]: I0316 00:13:21.887486 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-7fd798dd88-7v9zs" podUID="c682ba54-60b9-4293-ba42-dbde80524daf" containerName="controller-manager" containerID="cri-o://7cefae33906fb5e888e4b14f5d736ee34b5be1bef6cc5d12987073a3493715d9" gracePeriod=30 Mar 16 00:13:21 crc kubenswrapper[4816]: I0316 00:13:21.988148 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-66c4db87dd-spzwx"] Mar 16 00:13:21 crc kubenswrapper[4816]: I0316 00:13:21.988659 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-66c4db87dd-spzwx" podUID="d843d76b-9317-42aa-848b-e3e11c3106cb" containerName="route-controller-manager" containerID="cri-o://5e88284adb127d841afb88a27a9bd12da2e089c18ae8c1f363357186ff912634" gracePeriod=30 Mar 16 00:13:22 crc kubenswrapper[4816]: I0316 00:13:22.249173 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7fd798dd88-7v9zs" Mar 16 00:13:22 crc kubenswrapper[4816]: I0316 00:13:22.307832 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-66c4db87dd-spzwx" Mar 16 00:13:22 crc kubenswrapper[4816]: I0316 00:13:22.351000 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z8d5j\" (UniqueName: \"kubernetes.io/projected/c682ba54-60b9-4293-ba42-dbde80524daf-kube-api-access-z8d5j\") pod \"c682ba54-60b9-4293-ba42-dbde80524daf\" (UID: \"c682ba54-60b9-4293-ba42-dbde80524daf\") " Mar 16 00:13:22 crc kubenswrapper[4816]: I0316 00:13:22.351053 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c682ba54-60b9-4293-ba42-dbde80524daf-serving-cert\") pod \"c682ba54-60b9-4293-ba42-dbde80524daf\" (UID: \"c682ba54-60b9-4293-ba42-dbde80524daf\") " Mar 16 00:13:22 crc kubenswrapper[4816]: I0316 00:13:22.351112 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c682ba54-60b9-4293-ba42-dbde80524daf-proxy-ca-bundles\") pod \"c682ba54-60b9-4293-ba42-dbde80524daf\" (UID: \"c682ba54-60b9-4293-ba42-dbde80524daf\") " Mar 16 00:13:22 crc kubenswrapper[4816]: I0316 00:13:22.351260 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c682ba54-60b9-4293-ba42-dbde80524daf-client-ca\") pod \"c682ba54-60b9-4293-ba42-dbde80524daf\" (UID: \"c682ba54-60b9-4293-ba42-dbde80524daf\") " Mar 16 00:13:22 crc kubenswrapper[4816]: I0316 00:13:22.351323 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-snc4z\" (UniqueName: \"kubernetes.io/projected/d843d76b-9317-42aa-848b-e3e11c3106cb-kube-api-access-snc4z\") pod \"d843d76b-9317-42aa-848b-e3e11c3106cb\" (UID: \"d843d76b-9317-42aa-848b-e3e11c3106cb\") " Mar 16 00:13:22 crc kubenswrapper[4816]: I0316 00:13:22.351382 4816 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c682ba54-60b9-4293-ba42-dbde80524daf-config\") pod \"c682ba54-60b9-4293-ba42-dbde80524daf\" (UID: \"c682ba54-60b9-4293-ba42-dbde80524daf\") " Mar 16 00:13:22 crc kubenswrapper[4816]: I0316 00:13:22.351419 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d843d76b-9317-42aa-848b-e3e11c3106cb-config\") pod \"d843d76b-9317-42aa-848b-e3e11c3106cb\" (UID: \"d843d76b-9317-42aa-848b-e3e11c3106cb\") " Mar 16 00:13:22 crc kubenswrapper[4816]: I0316 00:13:22.351884 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c682ba54-60b9-4293-ba42-dbde80524daf-client-ca" (OuterVolumeSpecName: "client-ca") pod "c682ba54-60b9-4293-ba42-dbde80524daf" (UID: "c682ba54-60b9-4293-ba42-dbde80524daf"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:13:22 crc kubenswrapper[4816]: I0316 00:13:22.352093 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c682ba54-60b9-4293-ba42-dbde80524daf-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "c682ba54-60b9-4293-ba42-dbde80524daf" (UID: "c682ba54-60b9-4293-ba42-dbde80524daf"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:13:22 crc kubenswrapper[4816]: I0316 00:13:22.352273 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c682ba54-60b9-4293-ba42-dbde80524daf-config" (OuterVolumeSpecName: "config") pod "c682ba54-60b9-4293-ba42-dbde80524daf" (UID: "c682ba54-60b9-4293-ba42-dbde80524daf"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:13:22 crc kubenswrapper[4816]: I0316 00:13:22.352451 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d843d76b-9317-42aa-848b-e3e11c3106cb-config" (OuterVolumeSpecName: "config") pod "d843d76b-9317-42aa-848b-e3e11c3106cb" (UID: "d843d76b-9317-42aa-848b-e3e11c3106cb"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:13:22 crc kubenswrapper[4816]: I0316 00:13:22.356324 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c682ba54-60b9-4293-ba42-dbde80524daf-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "c682ba54-60b9-4293-ba42-dbde80524daf" (UID: "c682ba54-60b9-4293-ba42-dbde80524daf"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:13:22 crc kubenswrapper[4816]: I0316 00:13:22.356358 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d843d76b-9317-42aa-848b-e3e11c3106cb-kube-api-access-snc4z" (OuterVolumeSpecName: "kube-api-access-snc4z") pod "d843d76b-9317-42aa-848b-e3e11c3106cb" (UID: "d843d76b-9317-42aa-848b-e3e11c3106cb"). InnerVolumeSpecName "kube-api-access-snc4z". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:13:22 crc kubenswrapper[4816]: I0316 00:13:22.356482 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c682ba54-60b9-4293-ba42-dbde80524daf-kube-api-access-z8d5j" (OuterVolumeSpecName: "kube-api-access-z8d5j") pod "c682ba54-60b9-4293-ba42-dbde80524daf" (UID: "c682ba54-60b9-4293-ba42-dbde80524daf"). InnerVolumeSpecName "kube-api-access-z8d5j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:13:22 crc kubenswrapper[4816]: I0316 00:13:22.452754 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d843d76b-9317-42aa-848b-e3e11c3106cb-serving-cert\") pod \"d843d76b-9317-42aa-848b-e3e11c3106cb\" (UID: \"d843d76b-9317-42aa-848b-e3e11c3106cb\") " Mar 16 00:13:22 crc kubenswrapper[4816]: I0316 00:13:22.452842 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d843d76b-9317-42aa-848b-e3e11c3106cb-client-ca\") pod \"d843d76b-9317-42aa-848b-e3e11c3106cb\" (UID: \"d843d76b-9317-42aa-848b-e3e11c3106cb\") " Mar 16 00:13:22 crc kubenswrapper[4816]: I0316 00:13:22.452970 4816 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c682ba54-60b9-4293-ba42-dbde80524daf-config\") on node \"crc\" DevicePath \"\"" Mar 16 00:13:22 crc kubenswrapper[4816]: I0316 00:13:22.452980 4816 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d843d76b-9317-42aa-848b-e3e11c3106cb-config\") on node \"crc\" DevicePath \"\"" Mar 16 00:13:22 crc kubenswrapper[4816]: I0316 00:13:22.452989 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z8d5j\" (UniqueName: \"kubernetes.io/projected/c682ba54-60b9-4293-ba42-dbde80524daf-kube-api-access-z8d5j\") on node \"crc\" DevicePath \"\"" Mar 16 00:13:22 crc kubenswrapper[4816]: I0316 00:13:22.452999 4816 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c682ba54-60b9-4293-ba42-dbde80524daf-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 16 00:13:22 crc kubenswrapper[4816]: I0316 00:13:22.453008 4816 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/c682ba54-60b9-4293-ba42-dbde80524daf-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 16 00:13:22 crc kubenswrapper[4816]: I0316 00:13:22.453017 4816 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c682ba54-60b9-4293-ba42-dbde80524daf-client-ca\") on node \"crc\" DevicePath \"\"" Mar 16 00:13:22 crc kubenswrapper[4816]: I0316 00:13:22.453025 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-snc4z\" (UniqueName: \"kubernetes.io/projected/d843d76b-9317-42aa-848b-e3e11c3106cb-kube-api-access-snc4z\") on node \"crc\" DevicePath \"\"" Mar 16 00:13:22 crc kubenswrapper[4816]: I0316 00:13:22.453552 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d843d76b-9317-42aa-848b-e3e11c3106cb-client-ca" (OuterVolumeSpecName: "client-ca") pod "d843d76b-9317-42aa-848b-e3e11c3106cb" (UID: "d843d76b-9317-42aa-848b-e3e11c3106cb"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:13:22 crc kubenswrapper[4816]: I0316 00:13:22.456509 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d843d76b-9317-42aa-848b-e3e11c3106cb-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "d843d76b-9317-42aa-848b-e3e11c3106cb" (UID: "d843d76b-9317-42aa-848b-e3e11c3106cb"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:13:22 crc kubenswrapper[4816]: I0316 00:13:22.554264 4816 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d843d76b-9317-42aa-848b-e3e11c3106cb-client-ca\") on node \"crc\" DevicePath \"\"" Mar 16 00:13:22 crc kubenswrapper[4816]: I0316 00:13:22.554292 4816 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d843d76b-9317-42aa-848b-e3e11c3106cb-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 16 00:13:22 crc kubenswrapper[4816]: I0316 00:13:22.615055 4816 generic.go:334] "Generic (PLEG): container finished" podID="c682ba54-60b9-4293-ba42-dbde80524daf" containerID="7cefae33906fb5e888e4b14f5d736ee34b5be1bef6cc5d12987073a3493715d9" exitCode=0 Mar 16 00:13:22 crc kubenswrapper[4816]: I0316 00:13:22.615121 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7fd798dd88-7v9zs" event={"ID":"c682ba54-60b9-4293-ba42-dbde80524daf","Type":"ContainerDied","Data":"7cefae33906fb5e888e4b14f5d736ee34b5be1bef6cc5d12987073a3493715d9"} Mar 16 00:13:22 crc kubenswrapper[4816]: I0316 00:13:22.615148 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7fd798dd88-7v9zs" event={"ID":"c682ba54-60b9-4293-ba42-dbde80524daf","Type":"ContainerDied","Data":"3c1f3c1914f9bb0abf88eda5dfdf58f6ce50fa7199c9993ff27cf3aef4e09894"} Mar 16 00:13:22 crc kubenswrapper[4816]: I0316 00:13:22.615164 4816 scope.go:117] "RemoveContainer" containerID="7cefae33906fb5e888e4b14f5d736ee34b5be1bef6cc5d12987073a3493715d9" Mar 16 00:13:22 crc kubenswrapper[4816]: I0316 00:13:22.615196 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7fd798dd88-7v9zs" Mar 16 00:13:22 crc kubenswrapper[4816]: I0316 00:13:22.625804 4816 generic.go:334] "Generic (PLEG): container finished" podID="d843d76b-9317-42aa-848b-e3e11c3106cb" containerID="5e88284adb127d841afb88a27a9bd12da2e089c18ae8c1f363357186ff912634" exitCode=0 Mar 16 00:13:22 crc kubenswrapper[4816]: I0316 00:13:22.625844 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-66c4db87dd-spzwx" event={"ID":"d843d76b-9317-42aa-848b-e3e11c3106cb","Type":"ContainerDied","Data":"5e88284adb127d841afb88a27a9bd12da2e089c18ae8c1f363357186ff912634"} Mar 16 00:13:22 crc kubenswrapper[4816]: I0316 00:13:22.625888 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-66c4db87dd-spzwx" event={"ID":"d843d76b-9317-42aa-848b-e3e11c3106cb","Type":"ContainerDied","Data":"f342289f8f13f5d89f00dac92a6b213282ffa583b6c1a48b772dae90dc55fd82"} Mar 16 00:13:22 crc kubenswrapper[4816]: I0316 00:13:22.626056 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-66c4db87dd-spzwx" Mar 16 00:13:22 crc kubenswrapper[4816]: I0316 00:13:22.646017 4816 scope.go:117] "RemoveContainer" containerID="7cefae33906fb5e888e4b14f5d736ee34b5be1bef6cc5d12987073a3493715d9" Mar 16 00:13:22 crc kubenswrapper[4816]: E0316 00:13:22.646513 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7cefae33906fb5e888e4b14f5d736ee34b5be1bef6cc5d12987073a3493715d9\": container with ID starting with 7cefae33906fb5e888e4b14f5d736ee34b5be1bef6cc5d12987073a3493715d9 not found: ID does not exist" containerID="7cefae33906fb5e888e4b14f5d736ee34b5be1bef6cc5d12987073a3493715d9" Mar 16 00:13:22 crc kubenswrapper[4816]: I0316 00:13:22.646541 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7cefae33906fb5e888e4b14f5d736ee34b5be1bef6cc5d12987073a3493715d9"} err="failed to get container status \"7cefae33906fb5e888e4b14f5d736ee34b5be1bef6cc5d12987073a3493715d9\": rpc error: code = NotFound desc = could not find container \"7cefae33906fb5e888e4b14f5d736ee34b5be1bef6cc5d12987073a3493715d9\": container with ID starting with 7cefae33906fb5e888e4b14f5d736ee34b5be1bef6cc5d12987073a3493715d9 not found: ID does not exist" Mar 16 00:13:22 crc kubenswrapper[4816]: I0316 00:13:22.646576 4816 scope.go:117] "RemoveContainer" containerID="5e88284adb127d841afb88a27a9bd12da2e089c18ae8c1f363357186ff912634" Mar 16 00:13:22 crc kubenswrapper[4816]: I0316 00:13:22.660948 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7fd798dd88-7v9zs"] Mar 16 00:13:22 crc kubenswrapper[4816]: I0316 00:13:22.665823 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-7fd798dd88-7v9zs"] Mar 16 00:13:22 crc kubenswrapper[4816]: I0316 00:13:22.671421 4816 scope.go:117] 
"RemoveContainer" containerID="5e88284adb127d841afb88a27a9bd12da2e089c18ae8c1f363357186ff912634" Mar 16 00:13:22 crc kubenswrapper[4816]: E0316 00:13:22.672063 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e88284adb127d841afb88a27a9bd12da2e089c18ae8c1f363357186ff912634\": container with ID starting with 5e88284adb127d841afb88a27a9bd12da2e089c18ae8c1f363357186ff912634 not found: ID does not exist" containerID="5e88284adb127d841afb88a27a9bd12da2e089c18ae8c1f363357186ff912634" Mar 16 00:13:22 crc kubenswrapper[4816]: I0316 00:13:22.672134 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e88284adb127d841afb88a27a9bd12da2e089c18ae8c1f363357186ff912634"} err="failed to get container status \"5e88284adb127d841afb88a27a9bd12da2e089c18ae8c1f363357186ff912634\": rpc error: code = NotFound desc = could not find container \"5e88284adb127d841afb88a27a9bd12da2e089c18ae8c1f363357186ff912634\": container with ID starting with 5e88284adb127d841afb88a27a9bd12da2e089c18ae8c1f363357186ff912634 not found: ID does not exist" Mar 16 00:13:22 crc kubenswrapper[4816]: I0316 00:13:22.686271 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-66c4db87dd-spzwx"] Mar 16 00:13:22 crc kubenswrapper[4816]: I0316 00:13:22.694687 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-66c4db87dd-spzwx"] Mar 16 00:13:23 crc kubenswrapper[4816]: I0316 00:13:23.454078 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 16 00:13:23 crc kubenswrapper[4816]: I0316 00:13:23.484791 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-57f9cb4856-kxssx"] Mar 16 00:13:23 crc kubenswrapper[4816]: E0316 00:13:23.485324 
4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c682ba54-60b9-4293-ba42-dbde80524daf" containerName="controller-manager" Mar 16 00:13:23 crc kubenswrapper[4816]: I0316 00:13:23.485415 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="c682ba54-60b9-4293-ba42-dbde80524daf" containerName="controller-manager" Mar 16 00:13:23 crc kubenswrapper[4816]: E0316 00:13:23.485505 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9416716-e666-46d6-9d77-fe5c9702c035" containerName="installer" Mar 16 00:13:23 crc kubenswrapper[4816]: I0316 00:13:23.485606 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9416716-e666-46d6-9d77-fe5c9702c035" containerName="installer" Mar 16 00:13:23 crc kubenswrapper[4816]: E0316 00:13:23.485759 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d843d76b-9317-42aa-848b-e3e11c3106cb" containerName="route-controller-manager" Mar 16 00:13:23 crc kubenswrapper[4816]: I0316 00:13:23.485854 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="d843d76b-9317-42aa-848b-e3e11c3106cb" containerName="route-controller-manager" Mar 16 00:13:23 crc kubenswrapper[4816]: E0316 00:13:23.485952 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 16 00:13:23 crc kubenswrapper[4816]: I0316 00:13:23.486025 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 16 00:13:23 crc kubenswrapper[4816]: I0316 00:13:23.486243 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 16 00:13:23 crc kubenswrapper[4816]: I0316 00:13:23.486336 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="d843d76b-9317-42aa-848b-e3e11c3106cb" containerName="route-controller-manager" Mar 16 00:13:23 crc kubenswrapper[4816]: I0316 
00:13:23.486427 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="c682ba54-60b9-4293-ba42-dbde80524daf" containerName="controller-manager" Mar 16 00:13:23 crc kubenswrapper[4816]: I0316 00:13:23.486532 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9416716-e666-46d6-9d77-fe5c9702c035" containerName="installer" Mar 16 00:13:23 crc kubenswrapper[4816]: I0316 00:13:23.487082 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-57f9cb4856-kxssx" Mar 16 00:13:23 crc kubenswrapper[4816]: I0316 00:13:23.488835 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-647cdcffd4-5z6tn"] Mar 16 00:13:23 crc kubenswrapper[4816]: I0316 00:13:23.489522 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-647cdcffd4-5z6tn" Mar 16 00:13:23 crc kubenswrapper[4816]: I0316 00:13:23.491465 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 16 00:13:23 crc kubenswrapper[4816]: I0316 00:13:23.494998 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 16 00:13:23 crc kubenswrapper[4816]: I0316 00:13:23.495084 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 16 00:13:23 crc kubenswrapper[4816]: I0316 00:13:23.495154 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 16 00:13:23 crc kubenswrapper[4816]: I0316 00:13:23.495513 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 16 00:13:23 crc kubenswrapper[4816]: I0316 00:13:23.496028 4816 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 16 00:13:23 crc kubenswrapper[4816]: I0316 00:13:23.496275 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 16 00:13:23 crc kubenswrapper[4816]: I0316 00:13:23.496890 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 16 00:13:23 crc kubenswrapper[4816]: I0316 00:13:23.497414 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 16 00:13:23 crc kubenswrapper[4816]: I0316 00:13:23.497747 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 16 00:13:23 crc kubenswrapper[4816]: I0316 00:13:23.498125 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 16 00:13:23 crc kubenswrapper[4816]: I0316 00:13:23.501725 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 16 00:13:23 crc kubenswrapper[4816]: I0316 00:13:23.502290 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 16 00:13:23 crc kubenswrapper[4816]: I0316 00:13:23.509680 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-57f9cb4856-kxssx"] Mar 16 00:13:23 crc kubenswrapper[4816]: I0316 00:13:23.515534 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-647cdcffd4-5z6tn"] Mar 16 00:13:23 crc kubenswrapper[4816]: I0316 00:13:23.568163 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-xl2vf\" (UniqueName: \"kubernetes.io/projected/9bbefecc-c220-48cf-b69a-61571c8ad48f-kube-api-access-xl2vf\") pod \"controller-manager-57f9cb4856-kxssx\" (UID: \"9bbefecc-c220-48cf-b69a-61571c8ad48f\") " pod="openshift-controller-manager/controller-manager-57f9cb4856-kxssx" Mar 16 00:13:23 crc kubenswrapper[4816]: I0316 00:13:23.568229 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9bbefecc-c220-48cf-b69a-61571c8ad48f-config\") pod \"controller-manager-57f9cb4856-kxssx\" (UID: \"9bbefecc-c220-48cf-b69a-61571c8ad48f\") " pod="openshift-controller-manager/controller-manager-57f9cb4856-kxssx" Mar 16 00:13:23 crc kubenswrapper[4816]: I0316 00:13:23.568262 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bt8nr\" (UniqueName: \"kubernetes.io/projected/6fba9f52-b53b-4cbf-8bfb-0fe0938048c4-kube-api-access-bt8nr\") pod \"route-controller-manager-647cdcffd4-5z6tn\" (UID: \"6fba9f52-b53b-4cbf-8bfb-0fe0938048c4\") " pod="openshift-route-controller-manager/route-controller-manager-647cdcffd4-5z6tn" Mar 16 00:13:23 crc kubenswrapper[4816]: I0316 00:13:23.568342 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9bbefecc-c220-48cf-b69a-61571c8ad48f-proxy-ca-bundles\") pod \"controller-manager-57f9cb4856-kxssx\" (UID: \"9bbefecc-c220-48cf-b69a-61571c8ad48f\") " pod="openshift-controller-manager/controller-manager-57f9cb4856-kxssx" Mar 16 00:13:23 crc kubenswrapper[4816]: I0316 00:13:23.568369 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9bbefecc-c220-48cf-b69a-61571c8ad48f-client-ca\") pod \"controller-manager-57f9cb4856-kxssx\" (UID: \"9bbefecc-c220-48cf-b69a-61571c8ad48f\") " 
pod="openshift-controller-manager/controller-manager-57f9cb4856-kxssx" Mar 16 00:13:23 crc kubenswrapper[4816]: I0316 00:13:23.568389 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6fba9f52-b53b-4cbf-8bfb-0fe0938048c4-client-ca\") pod \"route-controller-manager-647cdcffd4-5z6tn\" (UID: \"6fba9f52-b53b-4cbf-8bfb-0fe0938048c4\") " pod="openshift-route-controller-manager/route-controller-manager-647cdcffd4-5z6tn" Mar 16 00:13:23 crc kubenswrapper[4816]: I0316 00:13:23.568459 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9bbefecc-c220-48cf-b69a-61571c8ad48f-serving-cert\") pod \"controller-manager-57f9cb4856-kxssx\" (UID: \"9bbefecc-c220-48cf-b69a-61571c8ad48f\") " pod="openshift-controller-manager/controller-manager-57f9cb4856-kxssx" Mar 16 00:13:23 crc kubenswrapper[4816]: I0316 00:13:23.568524 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6fba9f52-b53b-4cbf-8bfb-0fe0938048c4-serving-cert\") pod \"route-controller-manager-647cdcffd4-5z6tn\" (UID: \"6fba9f52-b53b-4cbf-8bfb-0fe0938048c4\") " pod="openshift-route-controller-manager/route-controller-manager-647cdcffd4-5z6tn" Mar 16 00:13:23 crc kubenswrapper[4816]: I0316 00:13:23.568607 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6fba9f52-b53b-4cbf-8bfb-0fe0938048c4-config\") pod \"route-controller-manager-647cdcffd4-5z6tn\" (UID: \"6fba9f52-b53b-4cbf-8bfb-0fe0938048c4\") " pod="openshift-route-controller-manager/route-controller-manager-647cdcffd4-5z6tn" Mar 16 00:13:23 crc kubenswrapper[4816]: I0316 00:13:23.669401 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-xl2vf\" (UniqueName: \"kubernetes.io/projected/9bbefecc-c220-48cf-b69a-61571c8ad48f-kube-api-access-xl2vf\") pod \"controller-manager-57f9cb4856-kxssx\" (UID: \"9bbefecc-c220-48cf-b69a-61571c8ad48f\") " pod="openshift-controller-manager/controller-manager-57f9cb4856-kxssx" Mar 16 00:13:23 crc kubenswrapper[4816]: I0316 00:13:23.669489 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9bbefecc-c220-48cf-b69a-61571c8ad48f-config\") pod \"controller-manager-57f9cb4856-kxssx\" (UID: \"9bbefecc-c220-48cf-b69a-61571c8ad48f\") " pod="openshift-controller-manager/controller-manager-57f9cb4856-kxssx" Mar 16 00:13:23 crc kubenswrapper[4816]: I0316 00:13:23.669531 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bt8nr\" (UniqueName: \"kubernetes.io/projected/6fba9f52-b53b-4cbf-8bfb-0fe0938048c4-kube-api-access-bt8nr\") pod \"route-controller-manager-647cdcffd4-5z6tn\" (UID: \"6fba9f52-b53b-4cbf-8bfb-0fe0938048c4\") " pod="openshift-route-controller-manager/route-controller-manager-647cdcffd4-5z6tn" Mar 16 00:13:23 crc kubenswrapper[4816]: I0316 00:13:23.669605 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9bbefecc-c220-48cf-b69a-61571c8ad48f-proxy-ca-bundles\") pod \"controller-manager-57f9cb4856-kxssx\" (UID: \"9bbefecc-c220-48cf-b69a-61571c8ad48f\") " pod="openshift-controller-manager/controller-manager-57f9cb4856-kxssx" Mar 16 00:13:23 crc kubenswrapper[4816]: I0316 00:13:23.669642 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9bbefecc-c220-48cf-b69a-61571c8ad48f-client-ca\") pod \"controller-manager-57f9cb4856-kxssx\" (UID: \"9bbefecc-c220-48cf-b69a-61571c8ad48f\") " pod="openshift-controller-manager/controller-manager-57f9cb4856-kxssx" Mar 16 
00:13:23 crc kubenswrapper[4816]: I0316 00:13:23.669753 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6fba9f52-b53b-4cbf-8bfb-0fe0938048c4-client-ca\") pod \"route-controller-manager-647cdcffd4-5z6tn\" (UID: \"6fba9f52-b53b-4cbf-8bfb-0fe0938048c4\") " pod="openshift-route-controller-manager/route-controller-manager-647cdcffd4-5z6tn" Mar 16 00:13:23 crc kubenswrapper[4816]: I0316 00:13:23.669780 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9bbefecc-c220-48cf-b69a-61571c8ad48f-serving-cert\") pod \"controller-manager-57f9cb4856-kxssx\" (UID: \"9bbefecc-c220-48cf-b69a-61571c8ad48f\") " pod="openshift-controller-manager/controller-manager-57f9cb4856-kxssx" Mar 16 00:13:23 crc kubenswrapper[4816]: I0316 00:13:23.669818 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6fba9f52-b53b-4cbf-8bfb-0fe0938048c4-serving-cert\") pod \"route-controller-manager-647cdcffd4-5z6tn\" (UID: \"6fba9f52-b53b-4cbf-8bfb-0fe0938048c4\") " pod="openshift-route-controller-manager/route-controller-manager-647cdcffd4-5z6tn" Mar 16 00:13:23 crc kubenswrapper[4816]: I0316 00:13:23.669860 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6fba9f52-b53b-4cbf-8bfb-0fe0938048c4-config\") pod \"route-controller-manager-647cdcffd4-5z6tn\" (UID: \"6fba9f52-b53b-4cbf-8bfb-0fe0938048c4\") " pod="openshift-route-controller-manager/route-controller-manager-647cdcffd4-5z6tn" Mar 16 00:13:23 crc kubenswrapper[4816]: I0316 00:13:23.671261 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6fba9f52-b53b-4cbf-8bfb-0fe0938048c4-config\") pod \"route-controller-manager-647cdcffd4-5z6tn\" (UID: 
\"6fba9f52-b53b-4cbf-8bfb-0fe0938048c4\") " pod="openshift-route-controller-manager/route-controller-manager-647cdcffd4-5z6tn" Mar 16 00:13:23 crc kubenswrapper[4816]: I0316 00:13:23.675858 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9bbefecc-c220-48cf-b69a-61571c8ad48f-proxy-ca-bundles\") pod \"controller-manager-57f9cb4856-kxssx\" (UID: \"9bbefecc-c220-48cf-b69a-61571c8ad48f\") " pod="openshift-controller-manager/controller-manager-57f9cb4856-kxssx" Mar 16 00:13:23 crc kubenswrapper[4816]: I0316 00:13:23.676527 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9bbefecc-c220-48cf-b69a-61571c8ad48f-config\") pod \"controller-manager-57f9cb4856-kxssx\" (UID: \"9bbefecc-c220-48cf-b69a-61571c8ad48f\") " pod="openshift-controller-manager/controller-manager-57f9cb4856-kxssx" Mar 16 00:13:23 crc kubenswrapper[4816]: I0316 00:13:23.676703 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c682ba54-60b9-4293-ba42-dbde80524daf" path="/var/lib/kubelet/pods/c682ba54-60b9-4293-ba42-dbde80524daf/volumes" Mar 16 00:13:23 crc kubenswrapper[4816]: I0316 00:13:23.677423 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d843d76b-9317-42aa-848b-e3e11c3106cb" path="/var/lib/kubelet/pods/d843d76b-9317-42aa-848b-e3e11c3106cb/volumes" Mar 16 00:13:23 crc kubenswrapper[4816]: I0316 00:13:23.681345 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6fba9f52-b53b-4cbf-8bfb-0fe0938048c4-serving-cert\") pod \"route-controller-manager-647cdcffd4-5z6tn\" (UID: \"6fba9f52-b53b-4cbf-8bfb-0fe0938048c4\") " pod="openshift-route-controller-manager/route-controller-manager-647cdcffd4-5z6tn" Mar 16 00:13:23 crc kubenswrapper[4816]: I0316 00:13:23.683190 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9bbefecc-c220-48cf-b69a-61571c8ad48f-serving-cert\") pod \"controller-manager-57f9cb4856-kxssx\" (UID: \"9bbefecc-c220-48cf-b69a-61571c8ad48f\") " pod="openshift-controller-manager/controller-manager-57f9cb4856-kxssx" Mar 16 00:13:23 crc kubenswrapper[4816]: I0316 00:13:23.685525 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6fba9f52-b53b-4cbf-8bfb-0fe0938048c4-client-ca\") pod \"route-controller-manager-647cdcffd4-5z6tn\" (UID: \"6fba9f52-b53b-4cbf-8bfb-0fe0938048c4\") " pod="openshift-route-controller-manager/route-controller-manager-647cdcffd4-5z6tn" Mar 16 00:13:23 crc kubenswrapper[4816]: I0316 00:13:23.686811 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9bbefecc-c220-48cf-b69a-61571c8ad48f-client-ca\") pod \"controller-manager-57f9cb4856-kxssx\" (UID: \"9bbefecc-c220-48cf-b69a-61571c8ad48f\") " pod="openshift-controller-manager/controller-manager-57f9cb4856-kxssx" Mar 16 00:13:23 crc kubenswrapper[4816]: I0316 00:13:23.690135 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xl2vf\" (UniqueName: \"kubernetes.io/projected/9bbefecc-c220-48cf-b69a-61571c8ad48f-kube-api-access-xl2vf\") pod \"controller-manager-57f9cb4856-kxssx\" (UID: \"9bbefecc-c220-48cf-b69a-61571c8ad48f\") " pod="openshift-controller-manager/controller-manager-57f9cb4856-kxssx" Mar 16 00:13:23 crc kubenswrapper[4816]: I0316 00:13:23.693506 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bt8nr\" (UniqueName: \"kubernetes.io/projected/6fba9f52-b53b-4cbf-8bfb-0fe0938048c4-kube-api-access-bt8nr\") pod \"route-controller-manager-647cdcffd4-5z6tn\" (UID: \"6fba9f52-b53b-4cbf-8bfb-0fe0938048c4\") " pod="openshift-route-controller-manager/route-controller-manager-647cdcffd4-5z6tn" Mar 16 00:13:23 crc 
kubenswrapper[4816]: I0316 00:13:23.818454 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-57f9cb4856-kxssx" Mar 16 00:13:23 crc kubenswrapper[4816]: I0316 00:13:23.841599 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-647cdcffd4-5z6tn" Mar 16 00:13:24 crc kubenswrapper[4816]: I0316 00:13:24.023519 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-57f9cb4856-kxssx"] Mar 16 00:13:24 crc kubenswrapper[4816]: I0316 00:13:24.089883 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-647cdcffd4-5z6tn"] Mar 16 00:13:24 crc kubenswrapper[4816]: W0316 00:13:24.098445 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6fba9f52_b53b_4cbf_8bfb_0fe0938048c4.slice/crio-0a59117cef9e6d9e6822a03d57d6d20cce4fe9b81cd544f8d6df885b7a59aaf4 WatchSource:0}: Error finding container 0a59117cef9e6d9e6822a03d57d6d20cce4fe9b81cd544f8d6df885b7a59aaf4: Status 404 returned error can't find the container with id 0a59117cef9e6d9e6822a03d57d6d20cce4fe9b81cd544f8d6df885b7a59aaf4 Mar 16 00:13:24 crc kubenswrapper[4816]: I0316 00:13:24.620942 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 16 00:13:24 crc kubenswrapper[4816]: I0316 00:13:24.621006 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 16 00:13:24 crc kubenswrapper[4816]: I0316 00:13:24.639526 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-57f9cb4856-kxssx" event={"ID":"9bbefecc-c220-48cf-b69a-61571c8ad48f","Type":"ContainerStarted","Data":"f4e58123a4dbd0b4ef4c934219a91e7f9e88f653dec47627ad6279cb525f5a3c"} Mar 16 00:13:24 crc kubenswrapper[4816]: I0316 00:13:24.639603 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-57f9cb4856-kxssx" event={"ID":"9bbefecc-c220-48cf-b69a-61571c8ad48f","Type":"ContainerStarted","Data":"03fd2c051d734f0a0c50d6dc9512f0c74423439d3b3fa7ab8edc636db3d5dc09"} Mar 16 00:13:24 crc kubenswrapper[4816]: I0316 00:13:24.639627 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-57f9cb4856-kxssx" Mar 16 00:13:24 crc kubenswrapper[4816]: I0316 00:13:24.641952 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 16 00:13:24 crc kubenswrapper[4816]: I0316 00:13:24.642011 4816 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="eed2d54a8999c489e0b768ad5f7c5739ae6d65acf1bf2efc0c5987b2cb49fba4" exitCode=137 Mar 16 00:13:24 crc kubenswrapper[4816]: I0316 00:13:24.642086 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 16 00:13:24 crc kubenswrapper[4816]: I0316 00:13:24.642092 4816 scope.go:117] "RemoveContainer" containerID="eed2d54a8999c489e0b768ad5f7c5739ae6d65acf1bf2efc0c5987b2cb49fba4" Mar 16 00:13:24 crc kubenswrapper[4816]: I0316 00:13:24.644039 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-647cdcffd4-5z6tn" event={"ID":"6fba9f52-b53b-4cbf-8bfb-0fe0938048c4","Type":"ContainerStarted","Data":"f8227ce649fe9ebfcf381abbf843e9fc12467efbe2942b266524e1c1a986c8ec"} Mar 16 00:13:24 crc kubenswrapper[4816]: I0316 00:13:24.644101 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-647cdcffd4-5z6tn" event={"ID":"6fba9f52-b53b-4cbf-8bfb-0fe0938048c4","Type":"ContainerStarted","Data":"0a59117cef9e6d9e6822a03d57d6d20cce4fe9b81cd544f8d6df885b7a59aaf4"} Mar 16 00:13:24 crc kubenswrapper[4816]: I0316 00:13:24.644397 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-647cdcffd4-5z6tn" Mar 16 00:13:24 crc kubenswrapper[4816]: I0316 00:13:24.645731 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-57f9cb4856-kxssx" Mar 16 00:13:24 crc kubenswrapper[4816]: I0316 00:13:24.649386 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-647cdcffd4-5z6tn" Mar 16 00:13:24 crc kubenswrapper[4816]: I0316 00:13:24.656462 4816 scope.go:117] "RemoveContainer" containerID="eed2d54a8999c489e0b768ad5f7c5739ae6d65acf1bf2efc0c5987b2cb49fba4" Mar 16 00:13:24 crc kubenswrapper[4816]: E0316 00:13:24.659854 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"eed2d54a8999c489e0b768ad5f7c5739ae6d65acf1bf2efc0c5987b2cb49fba4\": container with ID starting with eed2d54a8999c489e0b768ad5f7c5739ae6d65acf1bf2efc0c5987b2cb49fba4 not found: ID does not exist" containerID="eed2d54a8999c489e0b768ad5f7c5739ae6d65acf1bf2efc0c5987b2cb49fba4" Mar 16 00:13:24 crc kubenswrapper[4816]: I0316 00:13:24.659894 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eed2d54a8999c489e0b768ad5f7c5739ae6d65acf1bf2efc0c5987b2cb49fba4"} err="failed to get container status \"eed2d54a8999c489e0b768ad5f7c5739ae6d65acf1bf2efc0c5987b2cb49fba4\": rpc error: code = NotFound desc = could not find container \"eed2d54a8999c489e0b768ad5f7c5739ae6d65acf1bf2efc0c5987b2cb49fba4\": container with ID starting with eed2d54a8999c489e0b768ad5f7c5739ae6d65acf1bf2efc0c5987b2cb49fba4 not found: ID does not exist" Mar 16 00:13:24 crc kubenswrapper[4816]: I0316 00:13:24.672833 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-57f9cb4856-kxssx" podStartSLOduration=3.672813867 podStartE2EDuration="3.672813867s" podCreationTimestamp="2026-03-16 00:13:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:13:24.669308815 +0000 UTC m=+397.765608778" watchObservedRunningTime="2026-03-16 00:13:24.672813867 +0000 UTC m=+397.769113820" Mar 16 00:13:24 crc kubenswrapper[4816]: I0316 00:13:24.695209 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-647cdcffd4-5z6tn" podStartSLOduration=2.6951935110000003 podStartE2EDuration="2.695193511s" podCreationTimestamp="2026-03-16 00:13:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:13:24.693758249 +0000 UTC m=+397.790058202" 
watchObservedRunningTime="2026-03-16 00:13:24.695193511 +0000 UTC m=+397.791493464" Mar 16 00:13:24 crc kubenswrapper[4816]: I0316 00:13:24.783032 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 16 00:13:24 crc kubenswrapper[4816]: I0316 00:13:24.783098 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 16 00:13:24 crc kubenswrapper[4816]: I0316 00:13:24.783123 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 16 00:13:24 crc kubenswrapper[4816]: I0316 00:13:24.783159 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 16 00:13:24 crc kubenswrapper[4816]: I0316 00:13:24.783202 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 16 00:13:24 crc kubenswrapper[4816]: I0316 00:13:24.783207 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") 
pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:13:24 crc kubenswrapper[4816]: I0316 00:13:24.783548 4816 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Mar 16 00:13:24 crc kubenswrapper[4816]: I0316 00:13:24.783812 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:13:24 crc kubenswrapper[4816]: I0316 00:13:24.784261 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:13:24 crc kubenswrapper[4816]: I0316 00:13:24.784576 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:13:24 crc kubenswrapper[4816]: I0316 00:13:24.792535 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:13:24 crc kubenswrapper[4816]: I0316 00:13:24.884428 4816 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Mar 16 00:13:24 crc kubenswrapper[4816]: I0316 00:13:24.884466 4816 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Mar 16 00:13:24 crc kubenswrapper[4816]: I0316 00:13:24.884476 4816 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 16 00:13:24 crc kubenswrapper[4816]: I0316 00:13:24.884490 4816 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 16 00:13:25 crc kubenswrapper[4816]: I0316 00:13:25.608478 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 16 00:13:25 crc kubenswrapper[4816]: I0316 00:13:25.673608 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Mar 16 00:13:25 crc 
kubenswrapper[4816]: I0316 00:13:25.673859 4816 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="" Mar 16 00:13:25 crc kubenswrapper[4816]: I0316 00:13:25.684226 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 16 00:13:25 crc kubenswrapper[4816]: I0316 00:13:25.684272 4816 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="9aaafc3b-fd98-4f5e-9553-1c09a26bfc4e" Mar 16 00:13:25 crc kubenswrapper[4816]: I0316 00:13:25.687698 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 16 00:13:25 crc kubenswrapper[4816]: I0316 00:13:25.687738 4816 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="9aaafc3b-fd98-4f5e-9553-1c09a26bfc4e" Mar 16 00:13:26 crc kubenswrapper[4816]: I0316 00:13:26.007919 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 16 00:13:28 crc kubenswrapper[4816]: I0316 00:13:28.444252 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 16 00:13:29 crc kubenswrapper[4816]: I0316 00:13:29.052953 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 16 00:13:29 crc kubenswrapper[4816]: I0316 00:13:29.298583 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 16 00:13:32 crc kubenswrapper[4816]: I0316 00:13:32.672788 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 16 
00:13:35 crc kubenswrapper[4816]: I0316 00:13:35.360021 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 16 00:13:35 crc kubenswrapper[4816]: I0316 00:13:35.458947 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 16 00:13:40 crc kubenswrapper[4816]: I0316 00:13:40.788219 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 16 00:13:41 crc kubenswrapper[4816]: I0316 00:13:41.928810 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-57f9cb4856-kxssx"] Mar 16 00:13:41 crc kubenswrapper[4816]: I0316 00:13:41.929848 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-57f9cb4856-kxssx" podUID="9bbefecc-c220-48cf-b69a-61571c8ad48f" containerName="controller-manager" containerID="cri-o://f4e58123a4dbd0b4ef4c934219a91e7f9e88f653dec47627ad6279cb525f5a3c" gracePeriod=30 Mar 16 00:13:41 crc kubenswrapper[4816]: I0316 00:13:41.952171 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-647cdcffd4-5z6tn"] Mar 16 00:13:41 crc kubenswrapper[4816]: I0316 00:13:41.952598 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-647cdcffd4-5z6tn" podUID="6fba9f52-b53b-4cbf-8bfb-0fe0938048c4" containerName="route-controller-manager" containerID="cri-o://f8227ce649fe9ebfcf381abbf843e9fc12467efbe2942b266524e1c1a986c8ec" gracePeriod=30 Mar 16 00:13:42 crc kubenswrapper[4816]: I0316 00:13:42.440007 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-647cdcffd4-5z6tn" Mar 16 00:13:42 crc kubenswrapper[4816]: I0316 00:13:42.505448 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bt8nr\" (UniqueName: \"kubernetes.io/projected/6fba9f52-b53b-4cbf-8bfb-0fe0938048c4-kube-api-access-bt8nr\") pod \"6fba9f52-b53b-4cbf-8bfb-0fe0938048c4\" (UID: \"6fba9f52-b53b-4cbf-8bfb-0fe0938048c4\") " Mar 16 00:13:42 crc kubenswrapper[4816]: I0316 00:13:42.505518 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6fba9f52-b53b-4cbf-8bfb-0fe0938048c4-serving-cert\") pod \"6fba9f52-b53b-4cbf-8bfb-0fe0938048c4\" (UID: \"6fba9f52-b53b-4cbf-8bfb-0fe0938048c4\") " Mar 16 00:13:42 crc kubenswrapper[4816]: I0316 00:13:42.505619 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6fba9f52-b53b-4cbf-8bfb-0fe0938048c4-config\") pod \"6fba9f52-b53b-4cbf-8bfb-0fe0938048c4\" (UID: \"6fba9f52-b53b-4cbf-8bfb-0fe0938048c4\") " Mar 16 00:13:42 crc kubenswrapper[4816]: I0316 00:13:42.505668 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6fba9f52-b53b-4cbf-8bfb-0fe0938048c4-client-ca\") pod \"6fba9f52-b53b-4cbf-8bfb-0fe0938048c4\" (UID: \"6fba9f52-b53b-4cbf-8bfb-0fe0938048c4\") " Mar 16 00:13:42 crc kubenswrapper[4816]: I0316 00:13:42.506337 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6fba9f52-b53b-4cbf-8bfb-0fe0938048c4-client-ca" (OuterVolumeSpecName: "client-ca") pod "6fba9f52-b53b-4cbf-8bfb-0fe0938048c4" (UID: "6fba9f52-b53b-4cbf-8bfb-0fe0938048c4"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:13:42 crc kubenswrapper[4816]: I0316 00:13:42.506411 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6fba9f52-b53b-4cbf-8bfb-0fe0938048c4-config" (OuterVolumeSpecName: "config") pod "6fba9f52-b53b-4cbf-8bfb-0fe0938048c4" (UID: "6fba9f52-b53b-4cbf-8bfb-0fe0938048c4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:13:42 crc kubenswrapper[4816]: I0316 00:13:42.512733 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6fba9f52-b53b-4cbf-8bfb-0fe0938048c4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6fba9f52-b53b-4cbf-8bfb-0fe0938048c4" (UID: "6fba9f52-b53b-4cbf-8bfb-0fe0938048c4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:13:42 crc kubenswrapper[4816]: I0316 00:13:42.525355 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6fba9f52-b53b-4cbf-8bfb-0fe0938048c4-kube-api-access-bt8nr" (OuterVolumeSpecName: "kube-api-access-bt8nr") pod "6fba9f52-b53b-4cbf-8bfb-0fe0938048c4" (UID: "6fba9f52-b53b-4cbf-8bfb-0fe0938048c4"). InnerVolumeSpecName "kube-api-access-bt8nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:13:42 crc kubenswrapper[4816]: I0316 00:13:42.551894 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-57f9cb4856-kxssx" Mar 16 00:13:42 crc kubenswrapper[4816]: I0316 00:13:42.606567 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xl2vf\" (UniqueName: \"kubernetes.io/projected/9bbefecc-c220-48cf-b69a-61571c8ad48f-kube-api-access-xl2vf\") pod \"9bbefecc-c220-48cf-b69a-61571c8ad48f\" (UID: \"9bbefecc-c220-48cf-b69a-61571c8ad48f\") " Mar 16 00:13:42 crc kubenswrapper[4816]: I0316 00:13:42.606613 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9bbefecc-c220-48cf-b69a-61571c8ad48f-config\") pod \"9bbefecc-c220-48cf-b69a-61571c8ad48f\" (UID: \"9bbefecc-c220-48cf-b69a-61571c8ad48f\") " Mar 16 00:13:42 crc kubenswrapper[4816]: I0316 00:13:42.606639 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9bbefecc-c220-48cf-b69a-61571c8ad48f-serving-cert\") pod \"9bbefecc-c220-48cf-b69a-61571c8ad48f\" (UID: \"9bbefecc-c220-48cf-b69a-61571c8ad48f\") " Mar 16 00:13:42 crc kubenswrapper[4816]: I0316 00:13:42.606703 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9bbefecc-c220-48cf-b69a-61571c8ad48f-proxy-ca-bundles\") pod \"9bbefecc-c220-48cf-b69a-61571c8ad48f\" (UID: \"9bbefecc-c220-48cf-b69a-61571c8ad48f\") " Mar 16 00:13:42 crc kubenswrapper[4816]: I0316 00:13:42.606719 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9bbefecc-c220-48cf-b69a-61571c8ad48f-client-ca\") pod \"9bbefecc-c220-48cf-b69a-61571c8ad48f\" (UID: \"9bbefecc-c220-48cf-b69a-61571c8ad48f\") " Mar 16 00:13:42 crc kubenswrapper[4816]: I0316 00:13:42.607413 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/9bbefecc-c220-48cf-b69a-61571c8ad48f-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "9bbefecc-c220-48cf-b69a-61571c8ad48f" (UID: "9bbefecc-c220-48cf-b69a-61571c8ad48f"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:13:42 crc kubenswrapper[4816]: I0316 00:13:42.607427 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9bbefecc-c220-48cf-b69a-61571c8ad48f-client-ca" (OuterVolumeSpecName: "client-ca") pod "9bbefecc-c220-48cf-b69a-61571c8ad48f" (UID: "9bbefecc-c220-48cf-b69a-61571c8ad48f"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:13:42 crc kubenswrapper[4816]: I0316 00:13:42.607478 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9bbefecc-c220-48cf-b69a-61571c8ad48f-config" (OuterVolumeSpecName: "config") pod "9bbefecc-c220-48cf-b69a-61571c8ad48f" (UID: "9bbefecc-c220-48cf-b69a-61571c8ad48f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:13:42 crc kubenswrapper[4816]: I0316 00:13:42.607672 4816 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9bbefecc-c220-48cf-b69a-61571c8ad48f-config\") on node \"crc\" DevicePath \"\"" Mar 16 00:13:42 crc kubenswrapper[4816]: I0316 00:13:42.607694 4816 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6fba9f52-b53b-4cbf-8bfb-0fe0938048c4-config\") on node \"crc\" DevicePath \"\"" Mar 16 00:13:42 crc kubenswrapper[4816]: I0316 00:13:42.607706 4816 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6fba9f52-b53b-4cbf-8bfb-0fe0938048c4-client-ca\") on node \"crc\" DevicePath \"\"" Mar 16 00:13:42 crc kubenswrapper[4816]: I0316 00:13:42.607717 4816 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9bbefecc-c220-48cf-b69a-61571c8ad48f-client-ca\") on node \"crc\" DevicePath \"\"" Mar 16 00:13:42 crc kubenswrapper[4816]: I0316 00:13:42.607727 4816 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9bbefecc-c220-48cf-b69a-61571c8ad48f-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 16 00:13:42 crc kubenswrapper[4816]: I0316 00:13:42.607738 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bt8nr\" (UniqueName: \"kubernetes.io/projected/6fba9f52-b53b-4cbf-8bfb-0fe0938048c4-kube-api-access-bt8nr\") on node \"crc\" DevicePath \"\"" Mar 16 00:13:42 crc kubenswrapper[4816]: I0316 00:13:42.607748 4816 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6fba9f52-b53b-4cbf-8bfb-0fe0938048c4-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 16 00:13:42 crc kubenswrapper[4816]: I0316 00:13:42.610445 4816 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9bbefecc-c220-48cf-b69a-61571c8ad48f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9bbefecc-c220-48cf-b69a-61571c8ad48f" (UID: "9bbefecc-c220-48cf-b69a-61571c8ad48f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:13:42 crc kubenswrapper[4816]: I0316 00:13:42.614491 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 16 00:13:42 crc kubenswrapper[4816]: I0316 00:13:42.615324 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9bbefecc-c220-48cf-b69a-61571c8ad48f-kube-api-access-xl2vf" (OuterVolumeSpecName: "kube-api-access-xl2vf") pod "9bbefecc-c220-48cf-b69a-61571c8ad48f" (UID: "9bbefecc-c220-48cf-b69a-61571c8ad48f"). InnerVolumeSpecName "kube-api-access-xl2vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:13:42 crc kubenswrapper[4816]: I0316 00:13:42.709340 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xl2vf\" (UniqueName: \"kubernetes.io/projected/9bbefecc-c220-48cf-b69a-61571c8ad48f-kube-api-access-xl2vf\") on node \"crc\" DevicePath \"\"" Mar 16 00:13:42 crc kubenswrapper[4816]: I0316 00:13:42.709379 4816 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9bbefecc-c220-48cf-b69a-61571c8ad48f-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 16 00:13:42 crc kubenswrapper[4816]: I0316 00:13:42.764923 4816 generic.go:334] "Generic (PLEG): container finished" podID="9bbefecc-c220-48cf-b69a-61571c8ad48f" containerID="f4e58123a4dbd0b4ef4c934219a91e7f9e88f653dec47627ad6279cb525f5a3c" exitCode=0 Mar 16 00:13:42 crc kubenswrapper[4816]: I0316 00:13:42.765010 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-57f9cb4856-kxssx" 
event={"ID":"9bbefecc-c220-48cf-b69a-61571c8ad48f","Type":"ContainerDied","Data":"f4e58123a4dbd0b4ef4c934219a91e7f9e88f653dec47627ad6279cb525f5a3c"} Mar 16 00:13:42 crc kubenswrapper[4816]: I0316 00:13:42.765043 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-57f9cb4856-kxssx" event={"ID":"9bbefecc-c220-48cf-b69a-61571c8ad48f","Type":"ContainerDied","Data":"03fd2c051d734f0a0c50d6dc9512f0c74423439d3b3fa7ab8edc636db3d5dc09"} Mar 16 00:13:42 crc kubenswrapper[4816]: I0316 00:13:42.765063 4816 scope.go:117] "RemoveContainer" containerID="f4e58123a4dbd0b4ef4c934219a91e7f9e88f653dec47627ad6279cb525f5a3c" Mar 16 00:13:42 crc kubenswrapper[4816]: I0316 00:13:42.765203 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-57f9cb4856-kxssx" Mar 16 00:13:42 crc kubenswrapper[4816]: I0316 00:13:42.772642 4816 generic.go:334] "Generic (PLEG): container finished" podID="6fba9f52-b53b-4cbf-8bfb-0fe0938048c4" containerID="f8227ce649fe9ebfcf381abbf843e9fc12467efbe2942b266524e1c1a986c8ec" exitCode=0 Mar 16 00:13:42 crc kubenswrapper[4816]: I0316 00:13:42.772897 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-647cdcffd4-5z6tn" event={"ID":"6fba9f52-b53b-4cbf-8bfb-0fe0938048c4","Type":"ContainerDied","Data":"f8227ce649fe9ebfcf381abbf843e9fc12467efbe2942b266524e1c1a986c8ec"} Mar 16 00:13:42 crc kubenswrapper[4816]: I0316 00:13:42.772924 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-647cdcffd4-5z6tn" event={"ID":"6fba9f52-b53b-4cbf-8bfb-0fe0938048c4","Type":"ContainerDied","Data":"0a59117cef9e6d9e6822a03d57d6d20cce4fe9b81cd544f8d6df885b7a59aaf4"} Mar 16 00:13:42 crc kubenswrapper[4816]: I0316 00:13:42.773011 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-647cdcffd4-5z6tn" Mar 16 00:13:42 crc kubenswrapper[4816]: I0316 00:13:42.785404 4816 scope.go:117] "RemoveContainer" containerID="f4e58123a4dbd0b4ef4c934219a91e7f9e88f653dec47627ad6279cb525f5a3c" Mar 16 00:13:42 crc kubenswrapper[4816]: E0316 00:13:42.785909 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f4e58123a4dbd0b4ef4c934219a91e7f9e88f653dec47627ad6279cb525f5a3c\": container with ID starting with f4e58123a4dbd0b4ef4c934219a91e7f9e88f653dec47627ad6279cb525f5a3c not found: ID does not exist" containerID="f4e58123a4dbd0b4ef4c934219a91e7f9e88f653dec47627ad6279cb525f5a3c" Mar 16 00:13:42 crc kubenswrapper[4816]: I0316 00:13:42.785960 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4e58123a4dbd0b4ef4c934219a91e7f9e88f653dec47627ad6279cb525f5a3c"} err="failed to get container status \"f4e58123a4dbd0b4ef4c934219a91e7f9e88f653dec47627ad6279cb525f5a3c\": rpc error: code = NotFound desc = could not find container \"f4e58123a4dbd0b4ef4c934219a91e7f9e88f653dec47627ad6279cb525f5a3c\": container with ID starting with f4e58123a4dbd0b4ef4c934219a91e7f9e88f653dec47627ad6279cb525f5a3c not found: ID does not exist" Mar 16 00:13:42 crc kubenswrapper[4816]: I0316 00:13:42.786189 4816 scope.go:117] "RemoveContainer" containerID="f8227ce649fe9ebfcf381abbf843e9fc12467efbe2942b266524e1c1a986c8ec" Mar 16 00:13:42 crc kubenswrapper[4816]: I0316 00:13:42.800668 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-57f9cb4856-kxssx"] Mar 16 00:13:42 crc kubenswrapper[4816]: I0316 00:13:42.804059 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-57f9cb4856-kxssx"] Mar 16 00:13:42 crc kubenswrapper[4816]: I0316 00:13:42.819695 4816 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-647cdcffd4-5z6tn"] Mar 16 00:13:42 crc kubenswrapper[4816]: I0316 00:13:42.822049 4816 scope.go:117] "RemoveContainer" containerID="f8227ce649fe9ebfcf381abbf843e9fc12467efbe2942b266524e1c1a986c8ec" Mar 16 00:13:42 crc kubenswrapper[4816]: E0316 00:13:42.822533 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f8227ce649fe9ebfcf381abbf843e9fc12467efbe2942b266524e1c1a986c8ec\": container with ID starting with f8227ce649fe9ebfcf381abbf843e9fc12467efbe2942b266524e1c1a986c8ec not found: ID does not exist" containerID="f8227ce649fe9ebfcf381abbf843e9fc12467efbe2942b266524e1c1a986c8ec" Mar 16 00:13:42 crc kubenswrapper[4816]: I0316 00:13:42.822692 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8227ce649fe9ebfcf381abbf843e9fc12467efbe2942b266524e1c1a986c8ec"} err="failed to get container status \"f8227ce649fe9ebfcf381abbf843e9fc12467efbe2942b266524e1c1a986c8ec\": rpc error: code = NotFound desc = could not find container \"f8227ce649fe9ebfcf381abbf843e9fc12467efbe2942b266524e1c1a986c8ec\": container with ID starting with f8227ce649fe9ebfcf381abbf843e9fc12467efbe2942b266524e1c1a986c8ec not found: ID does not exist" Mar 16 00:13:42 crc kubenswrapper[4816]: I0316 00:13:42.825548 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-647cdcffd4-5z6tn"] Mar 16 00:13:43 crc kubenswrapper[4816]: I0316 00:13:43.501723 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-69d5f8f747-vqm27"] Mar 16 00:13:43 crc kubenswrapper[4816]: E0316 00:13:43.502001 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fba9f52-b53b-4cbf-8bfb-0fe0938048c4" containerName="route-controller-manager" Mar 16 00:13:43 crc kubenswrapper[4816]: 
I0316 00:13:43.502014 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fba9f52-b53b-4cbf-8bfb-0fe0938048c4" containerName="route-controller-manager" Mar 16 00:13:43 crc kubenswrapper[4816]: E0316 00:13:43.502029 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9bbefecc-c220-48cf-b69a-61571c8ad48f" containerName="controller-manager" Mar 16 00:13:43 crc kubenswrapper[4816]: I0316 00:13:43.502035 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="9bbefecc-c220-48cf-b69a-61571c8ad48f" containerName="controller-manager" Mar 16 00:13:43 crc kubenswrapper[4816]: I0316 00:13:43.502154 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="6fba9f52-b53b-4cbf-8bfb-0fe0938048c4" containerName="route-controller-manager" Mar 16 00:13:43 crc kubenswrapper[4816]: I0316 00:13:43.502167 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="9bbefecc-c220-48cf-b69a-61571c8ad48f" containerName="controller-manager" Mar 16 00:13:43 crc kubenswrapper[4816]: I0316 00:13:43.502618 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-69d5f8f747-vqm27" Mar 16 00:13:43 crc kubenswrapper[4816]: I0316 00:13:43.503584 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7b94474fcc-vzgw6"] Mar 16 00:13:43 crc kubenswrapper[4816]: I0316 00:13:43.504298 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7b94474fcc-vzgw6" Mar 16 00:13:43 crc kubenswrapper[4816]: I0316 00:13:43.506762 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 16 00:13:43 crc kubenswrapper[4816]: I0316 00:13:43.506798 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 16 00:13:43 crc kubenswrapper[4816]: I0316 00:13:43.509376 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 16 00:13:43 crc kubenswrapper[4816]: I0316 00:13:43.510987 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 16 00:13:43 crc kubenswrapper[4816]: I0316 00:13:43.510996 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 16 00:13:43 crc kubenswrapper[4816]: I0316 00:13:43.511012 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 16 00:13:43 crc kubenswrapper[4816]: I0316 00:13:43.510993 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 16 00:13:43 crc kubenswrapper[4816]: I0316 00:13:43.511209 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 16 00:13:43 crc kubenswrapper[4816]: I0316 00:13:43.513713 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 16 00:13:43 crc kubenswrapper[4816]: I0316 00:13:43.514108 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 16 00:13:43 crc 
kubenswrapper[4816]: I0316 00:13:43.514139 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 16 00:13:43 crc kubenswrapper[4816]: I0316 00:13:43.514259 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 16 00:13:43 crc kubenswrapper[4816]: I0316 00:13:43.518166 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7b94474fcc-vzgw6"] Mar 16 00:13:43 crc kubenswrapper[4816]: I0316 00:13:43.519504 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/14521451-81b6-4214-883a-cd05a9357517-proxy-ca-bundles\") pod \"controller-manager-7b94474fcc-vzgw6\" (UID: \"14521451-81b6-4214-883a-cd05a9357517\") " pod="openshift-controller-manager/controller-manager-7b94474fcc-vzgw6" Mar 16 00:13:43 crc kubenswrapper[4816]: I0316 00:13:43.519566 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/14521451-81b6-4214-883a-cd05a9357517-config\") pod \"controller-manager-7b94474fcc-vzgw6\" (UID: \"14521451-81b6-4214-883a-cd05a9357517\") " pod="openshift-controller-manager/controller-manager-7b94474fcc-vzgw6" Mar 16 00:13:43 crc kubenswrapper[4816]: I0316 00:13:43.519684 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khgb4\" (UniqueName: \"kubernetes.io/projected/14521451-81b6-4214-883a-cd05a9357517-kube-api-access-khgb4\") pod \"controller-manager-7b94474fcc-vzgw6\" (UID: \"14521451-81b6-4214-883a-cd05a9357517\") " pod="openshift-controller-manager/controller-manager-7b94474fcc-vzgw6" Mar 16 00:13:43 crc kubenswrapper[4816]: I0316 00:13:43.519778 4816 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2ab40c56-2ac9-49f5-8370-ff1ae7c5757f-client-ca\") pod \"route-controller-manager-69d5f8f747-vqm27\" (UID: \"2ab40c56-2ac9-49f5-8370-ff1ae7c5757f\") " pod="openshift-route-controller-manager/route-controller-manager-69d5f8f747-vqm27" Mar 16 00:13:43 crc kubenswrapper[4816]: I0316 00:13:43.519802 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/14521451-81b6-4214-883a-cd05a9357517-client-ca\") pod \"controller-manager-7b94474fcc-vzgw6\" (UID: \"14521451-81b6-4214-883a-cd05a9357517\") " pod="openshift-controller-manager/controller-manager-7b94474fcc-vzgw6" Mar 16 00:13:43 crc kubenswrapper[4816]: I0316 00:13:43.519864 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2ab40c56-2ac9-49f5-8370-ff1ae7c5757f-serving-cert\") pod \"route-controller-manager-69d5f8f747-vqm27\" (UID: \"2ab40c56-2ac9-49f5-8370-ff1ae7c5757f\") " pod="openshift-route-controller-manager/route-controller-manager-69d5f8f747-vqm27" Mar 16 00:13:43 crc kubenswrapper[4816]: I0316 00:13:43.519904 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/14521451-81b6-4214-883a-cd05a9357517-serving-cert\") pod \"controller-manager-7b94474fcc-vzgw6\" (UID: \"14521451-81b6-4214-883a-cd05a9357517\") " pod="openshift-controller-manager/controller-manager-7b94474fcc-vzgw6" Mar 16 00:13:43 crc kubenswrapper[4816]: I0316 00:13:43.519947 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8tf9t\" (UniqueName: \"kubernetes.io/projected/2ab40c56-2ac9-49f5-8370-ff1ae7c5757f-kube-api-access-8tf9t\") pod 
\"route-controller-manager-69d5f8f747-vqm27\" (UID: \"2ab40c56-2ac9-49f5-8370-ff1ae7c5757f\") " pod="openshift-route-controller-manager/route-controller-manager-69d5f8f747-vqm27" Mar 16 00:13:43 crc kubenswrapper[4816]: I0316 00:13:43.519973 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ab40c56-2ac9-49f5-8370-ff1ae7c5757f-config\") pod \"route-controller-manager-69d5f8f747-vqm27\" (UID: \"2ab40c56-2ac9-49f5-8370-ff1ae7c5757f\") " pod="openshift-route-controller-manager/route-controller-manager-69d5f8f747-vqm27" Mar 16 00:13:43 crc kubenswrapper[4816]: I0316 00:13:43.520044 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 16 00:13:43 crc kubenswrapper[4816]: I0316 00:13:43.522332 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-69d5f8f747-vqm27"] Mar 16 00:13:43 crc kubenswrapper[4816]: I0316 00:13:43.621746 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ab40c56-2ac9-49f5-8370-ff1ae7c5757f-config\") pod \"route-controller-manager-69d5f8f747-vqm27\" (UID: \"2ab40c56-2ac9-49f5-8370-ff1ae7c5757f\") " pod="openshift-route-controller-manager/route-controller-manager-69d5f8f747-vqm27" Mar 16 00:13:43 crc kubenswrapper[4816]: I0316 00:13:43.621813 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/14521451-81b6-4214-883a-cd05a9357517-proxy-ca-bundles\") pod \"controller-manager-7b94474fcc-vzgw6\" (UID: \"14521451-81b6-4214-883a-cd05a9357517\") " pod="openshift-controller-manager/controller-manager-7b94474fcc-vzgw6" Mar 16 00:13:43 crc kubenswrapper[4816]: I0316 00:13:43.621853 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/14521451-81b6-4214-883a-cd05a9357517-config\") pod \"controller-manager-7b94474fcc-vzgw6\" (UID: \"14521451-81b6-4214-883a-cd05a9357517\") " pod="openshift-controller-manager/controller-manager-7b94474fcc-vzgw6" Mar 16 00:13:43 crc kubenswrapper[4816]: I0316 00:13:43.621902 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-khgb4\" (UniqueName: \"kubernetes.io/projected/14521451-81b6-4214-883a-cd05a9357517-kube-api-access-khgb4\") pod \"controller-manager-7b94474fcc-vzgw6\" (UID: \"14521451-81b6-4214-883a-cd05a9357517\") " pod="openshift-controller-manager/controller-manager-7b94474fcc-vzgw6" Mar 16 00:13:43 crc kubenswrapper[4816]: I0316 00:13:43.621938 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2ab40c56-2ac9-49f5-8370-ff1ae7c5757f-client-ca\") pod \"route-controller-manager-69d5f8f747-vqm27\" (UID: \"2ab40c56-2ac9-49f5-8370-ff1ae7c5757f\") " pod="openshift-route-controller-manager/route-controller-manager-69d5f8f747-vqm27" Mar 16 00:13:43 crc kubenswrapper[4816]: I0316 00:13:43.621968 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/14521451-81b6-4214-883a-cd05a9357517-client-ca\") pod \"controller-manager-7b94474fcc-vzgw6\" (UID: \"14521451-81b6-4214-883a-cd05a9357517\") " pod="openshift-controller-manager/controller-manager-7b94474fcc-vzgw6" Mar 16 00:13:43 crc kubenswrapper[4816]: I0316 00:13:43.621996 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2ab40c56-2ac9-49f5-8370-ff1ae7c5757f-serving-cert\") pod \"route-controller-manager-69d5f8f747-vqm27\" (UID: \"2ab40c56-2ac9-49f5-8370-ff1ae7c5757f\") " pod="openshift-route-controller-manager/route-controller-manager-69d5f8f747-vqm27" Mar 16 00:13:43 crc 
kubenswrapper[4816]: I0316 00:13:43.622016 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/14521451-81b6-4214-883a-cd05a9357517-serving-cert\") pod \"controller-manager-7b94474fcc-vzgw6\" (UID: \"14521451-81b6-4214-883a-cd05a9357517\") " pod="openshift-controller-manager/controller-manager-7b94474fcc-vzgw6" Mar 16 00:13:43 crc kubenswrapper[4816]: I0316 00:13:43.622043 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8tf9t\" (UniqueName: \"kubernetes.io/projected/2ab40c56-2ac9-49f5-8370-ff1ae7c5757f-kube-api-access-8tf9t\") pod \"route-controller-manager-69d5f8f747-vqm27\" (UID: \"2ab40c56-2ac9-49f5-8370-ff1ae7c5757f\") " pod="openshift-route-controller-manager/route-controller-manager-69d5f8f747-vqm27" Mar 16 00:13:43 crc kubenswrapper[4816]: I0316 00:13:43.623055 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/14521451-81b6-4214-883a-cd05a9357517-proxy-ca-bundles\") pod \"controller-manager-7b94474fcc-vzgw6\" (UID: \"14521451-81b6-4214-883a-cd05a9357517\") " pod="openshift-controller-manager/controller-manager-7b94474fcc-vzgw6" Mar 16 00:13:43 crc kubenswrapper[4816]: I0316 00:13:43.623087 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/14521451-81b6-4214-883a-cd05a9357517-client-ca\") pod \"controller-manager-7b94474fcc-vzgw6\" (UID: \"14521451-81b6-4214-883a-cd05a9357517\") " pod="openshift-controller-manager/controller-manager-7b94474fcc-vzgw6" Mar 16 00:13:43 crc kubenswrapper[4816]: I0316 00:13:43.623132 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ab40c56-2ac9-49f5-8370-ff1ae7c5757f-config\") pod \"route-controller-manager-69d5f8f747-vqm27\" (UID: 
\"2ab40c56-2ac9-49f5-8370-ff1ae7c5757f\") " pod="openshift-route-controller-manager/route-controller-manager-69d5f8f747-vqm27" Mar 16 00:13:43 crc kubenswrapper[4816]: I0316 00:13:43.623150 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2ab40c56-2ac9-49f5-8370-ff1ae7c5757f-client-ca\") pod \"route-controller-manager-69d5f8f747-vqm27\" (UID: \"2ab40c56-2ac9-49f5-8370-ff1ae7c5757f\") " pod="openshift-route-controller-manager/route-controller-manager-69d5f8f747-vqm27" Mar 16 00:13:43 crc kubenswrapper[4816]: I0316 00:13:43.624094 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/14521451-81b6-4214-883a-cd05a9357517-config\") pod \"controller-manager-7b94474fcc-vzgw6\" (UID: \"14521451-81b6-4214-883a-cd05a9357517\") " pod="openshift-controller-manager/controller-manager-7b94474fcc-vzgw6" Mar 16 00:13:43 crc kubenswrapper[4816]: I0316 00:13:43.632663 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2ab40c56-2ac9-49f5-8370-ff1ae7c5757f-serving-cert\") pod \"route-controller-manager-69d5f8f747-vqm27\" (UID: \"2ab40c56-2ac9-49f5-8370-ff1ae7c5757f\") " pod="openshift-route-controller-manager/route-controller-manager-69d5f8f747-vqm27" Mar 16 00:13:43 crc kubenswrapper[4816]: I0316 00:13:43.634144 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/14521451-81b6-4214-883a-cd05a9357517-serving-cert\") pod \"controller-manager-7b94474fcc-vzgw6\" (UID: \"14521451-81b6-4214-883a-cd05a9357517\") " pod="openshift-controller-manager/controller-manager-7b94474fcc-vzgw6" Mar 16 00:13:43 crc kubenswrapper[4816]: I0316 00:13:43.640269 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8tf9t\" (UniqueName: 
\"kubernetes.io/projected/2ab40c56-2ac9-49f5-8370-ff1ae7c5757f-kube-api-access-8tf9t\") pod \"route-controller-manager-69d5f8f747-vqm27\" (UID: \"2ab40c56-2ac9-49f5-8370-ff1ae7c5757f\") " pod="openshift-route-controller-manager/route-controller-manager-69d5f8f747-vqm27" Mar 16 00:13:43 crc kubenswrapper[4816]: I0316 00:13:43.640468 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-khgb4\" (UniqueName: \"kubernetes.io/projected/14521451-81b6-4214-883a-cd05a9357517-kube-api-access-khgb4\") pod \"controller-manager-7b94474fcc-vzgw6\" (UID: \"14521451-81b6-4214-883a-cd05a9357517\") " pod="openshift-controller-manager/controller-manager-7b94474fcc-vzgw6" Mar 16 00:13:43 crc kubenswrapper[4816]: I0316 00:13:43.673766 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6fba9f52-b53b-4cbf-8bfb-0fe0938048c4" path="/var/lib/kubelet/pods/6fba9f52-b53b-4cbf-8bfb-0fe0938048c4/volumes" Mar 16 00:13:43 crc kubenswrapper[4816]: I0316 00:13:43.674327 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9bbefecc-c220-48cf-b69a-61571c8ad48f" path="/var/lib/kubelet/pods/9bbefecc-c220-48cf-b69a-61571c8ad48f/volumes" Mar 16 00:13:43 crc kubenswrapper[4816]: I0316 00:13:43.822018 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-69d5f8f747-vqm27" Mar 16 00:13:43 crc kubenswrapper[4816]: I0316 00:13:43.845615 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7b94474fcc-vzgw6" Mar 16 00:13:44 crc kubenswrapper[4816]: I0316 00:13:44.068148 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7b94474fcc-vzgw6"] Mar 16 00:13:44 crc kubenswrapper[4816]: I0316 00:13:44.235817 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-69d5f8f747-vqm27"] Mar 16 00:13:44 crc kubenswrapper[4816]: W0316 00:13:44.240738 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2ab40c56_2ac9_49f5_8370_ff1ae7c5757f.slice/crio-916114dd72baadc2e3d1c4e882df4092bba32ea74a936ec4e52471a9ade09699 WatchSource:0}: Error finding container 916114dd72baadc2e3d1c4e882df4092bba32ea74a936ec4e52471a9ade09699: Status 404 returned error can't find the container with id 916114dd72baadc2e3d1c4e882df4092bba32ea74a936ec4e52471a9ade09699 Mar 16 00:13:44 crc kubenswrapper[4816]: I0316 00:13:44.625402 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hvpqn"] Mar 16 00:13:44 crc kubenswrapper[4816]: I0316 00:13:44.625983 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-hvpqn" podUID="cc1ea93d-1cf8-4145-ad35-83f2d1357f9d" containerName="registry-server" containerID="cri-o://8f95ead769819114b5324ad74b013a299738b066a23d9b7aab0526d5b3f15f3a" gracePeriod=2 Mar 16 00:13:44 crc kubenswrapper[4816]: I0316 00:13:44.786902 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7b94474fcc-vzgw6" event={"ID":"14521451-81b6-4214-883a-cd05a9357517","Type":"ContainerStarted","Data":"83a4b2018bae2ed06df319b69f24a3f9e5ae7f538f9077ab4ab4efca28173ee4"} Mar 16 00:13:44 crc kubenswrapper[4816]: I0316 00:13:44.786947 4816 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7b94474fcc-vzgw6" event={"ID":"14521451-81b6-4214-883a-cd05a9357517","Type":"ContainerStarted","Data":"7fd72c6bda85d63e3016f0f126428c17855b91cdeed58a7acc5610da9342d4f1"} Mar 16 00:13:44 crc kubenswrapper[4816]: I0316 00:13:44.787184 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7b94474fcc-vzgw6" Mar 16 00:13:44 crc kubenswrapper[4816]: I0316 00:13:44.789252 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-69d5f8f747-vqm27" event={"ID":"2ab40c56-2ac9-49f5-8370-ff1ae7c5757f","Type":"ContainerStarted","Data":"6fb0b9447a2d26404d4ce546e776345ac9036f9ccc4aa0b57f561efbdf3d4e9a"} Mar 16 00:13:44 crc kubenswrapper[4816]: I0316 00:13:44.789294 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-69d5f8f747-vqm27" event={"ID":"2ab40c56-2ac9-49f5-8370-ff1ae7c5757f","Type":"ContainerStarted","Data":"916114dd72baadc2e3d1c4e882df4092bba32ea74a936ec4e52471a9ade09699"} Mar 16 00:13:44 crc kubenswrapper[4816]: I0316 00:13:44.789407 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-69d5f8f747-vqm27" Mar 16 00:13:44 crc kubenswrapper[4816]: I0316 00:13:44.791133 4816 generic.go:334] "Generic (PLEG): container finished" podID="cc1ea93d-1cf8-4145-ad35-83f2d1357f9d" containerID="8f95ead769819114b5324ad74b013a299738b066a23d9b7aab0526d5b3f15f3a" exitCode=0 Mar 16 00:13:44 crc kubenswrapper[4816]: I0316 00:13:44.791158 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hvpqn" event={"ID":"cc1ea93d-1cf8-4145-ad35-83f2d1357f9d","Type":"ContainerDied","Data":"8f95ead769819114b5324ad74b013a299738b066a23d9b7aab0526d5b3f15f3a"} Mar 16 00:13:44 crc kubenswrapper[4816]: 
I0316 00:13:44.793311 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7b94474fcc-vzgw6" Mar 16 00:13:44 crc kubenswrapper[4816]: I0316 00:13:44.808199 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7b94474fcc-vzgw6" podStartSLOduration=3.808180665 podStartE2EDuration="3.808180665s" podCreationTimestamp="2026-03-16 00:13:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:13:44.804960631 +0000 UTC m=+417.901260584" watchObservedRunningTime="2026-03-16 00:13:44.808180665 +0000 UTC m=+417.904480618" Mar 16 00:13:44 crc kubenswrapper[4816]: I0316 00:13:44.968672 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-69d5f8f747-vqm27" Mar 16 00:13:44 crc kubenswrapper[4816]: I0316 00:13:44.998106 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-69d5f8f747-vqm27" podStartSLOduration=3.9980854409999997 podStartE2EDuration="3.998085441s" podCreationTimestamp="2026-03-16 00:13:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:13:44.845992929 +0000 UTC m=+417.942292882" watchObservedRunningTime="2026-03-16 00:13:44.998085441 +0000 UTC m=+418.094385394" Mar 16 00:13:45 crc kubenswrapper[4816]: I0316 00:13:45.132703 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-hvpqn" Mar 16 00:13:45 crc kubenswrapper[4816]: I0316 00:13:45.242912 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 16 00:13:45 crc kubenswrapper[4816]: I0316 00:13:45.248610 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc1ea93d-1cf8-4145-ad35-83f2d1357f9d-utilities\") pod \"cc1ea93d-1cf8-4145-ad35-83f2d1357f9d\" (UID: \"cc1ea93d-1cf8-4145-ad35-83f2d1357f9d\") " Mar 16 00:13:45 crc kubenswrapper[4816]: I0316 00:13:45.248674 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc1ea93d-1cf8-4145-ad35-83f2d1357f9d-catalog-content\") pod \"cc1ea93d-1cf8-4145-ad35-83f2d1357f9d\" (UID: \"cc1ea93d-1cf8-4145-ad35-83f2d1357f9d\") " Mar 16 00:13:45 crc kubenswrapper[4816]: I0316 00:13:45.248733 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qk7j9\" (UniqueName: \"kubernetes.io/projected/cc1ea93d-1cf8-4145-ad35-83f2d1357f9d-kube-api-access-qk7j9\") pod \"cc1ea93d-1cf8-4145-ad35-83f2d1357f9d\" (UID: \"cc1ea93d-1cf8-4145-ad35-83f2d1357f9d\") " Mar 16 00:13:45 crc kubenswrapper[4816]: I0316 00:13:45.249582 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cc1ea93d-1cf8-4145-ad35-83f2d1357f9d-utilities" (OuterVolumeSpecName: "utilities") pod "cc1ea93d-1cf8-4145-ad35-83f2d1357f9d" (UID: "cc1ea93d-1cf8-4145-ad35-83f2d1357f9d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:13:45 crc kubenswrapper[4816]: I0316 00:13:45.256402 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc1ea93d-1cf8-4145-ad35-83f2d1357f9d-kube-api-access-qk7j9" (OuterVolumeSpecName: "kube-api-access-qk7j9") pod "cc1ea93d-1cf8-4145-ad35-83f2d1357f9d" (UID: "cc1ea93d-1cf8-4145-ad35-83f2d1357f9d"). InnerVolumeSpecName "kube-api-access-qk7j9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:13:45 crc kubenswrapper[4816]: I0316 00:13:45.350533 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qk7j9\" (UniqueName: \"kubernetes.io/projected/cc1ea93d-1cf8-4145-ad35-83f2d1357f9d-kube-api-access-qk7j9\") on node \"crc\" DevicePath \"\"" Mar 16 00:13:45 crc kubenswrapper[4816]: I0316 00:13:45.350580 4816 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc1ea93d-1cf8-4145-ad35-83f2d1357f9d-utilities\") on node \"crc\" DevicePath \"\"" Mar 16 00:13:45 crc kubenswrapper[4816]: I0316 00:13:45.383233 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cc1ea93d-1cf8-4145-ad35-83f2d1357f9d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cc1ea93d-1cf8-4145-ad35-83f2d1357f9d" (UID: "cc1ea93d-1cf8-4145-ad35-83f2d1357f9d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:13:45 crc kubenswrapper[4816]: I0316 00:13:45.451417 4816 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc1ea93d-1cf8-4145-ad35-83f2d1357f9d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 16 00:13:45 crc kubenswrapper[4816]: I0316 00:13:45.797311 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hvpqn" event={"ID":"cc1ea93d-1cf8-4145-ad35-83f2d1357f9d","Type":"ContainerDied","Data":"a7d840d19860a5867af8d4206630041069552968b0c74710a21974d2b8f8f661"} Mar 16 00:13:45 crc kubenswrapper[4816]: I0316 00:13:45.797952 4816 scope.go:117] "RemoveContainer" containerID="8f95ead769819114b5324ad74b013a299738b066a23d9b7aab0526d5b3f15f3a" Mar 16 00:13:45 crc kubenswrapper[4816]: I0316 00:13:45.797355 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hvpqn" Mar 16 00:13:45 crc kubenswrapper[4816]: I0316 00:13:45.812670 4816 scope.go:117] "RemoveContainer" containerID="12e67fc1baf84e28d7eb14a44704825a68bd0357e121983c70625a0778be907a" Mar 16 00:13:45 crc kubenswrapper[4816]: I0316 00:13:45.815816 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hvpqn"] Mar 16 00:13:45 crc kubenswrapper[4816]: I0316 00:13:45.820085 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-hvpqn"] Mar 16 00:13:45 crc kubenswrapper[4816]: I0316 00:13:45.830034 4816 scope.go:117] "RemoveContainer" containerID="d3d02136defedca51b696822546773a5d6f3e05f0581bc5504bae4a17393efcc" Mar 16 00:13:47 crc kubenswrapper[4816]: I0316 00:13:47.675897 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc1ea93d-1cf8-4145-ad35-83f2d1357f9d" path="/var/lib/kubelet/pods/cc1ea93d-1cf8-4145-ad35-83f2d1357f9d/volumes" Mar 16 00:14:00 crc 
kubenswrapper[4816]: I0316 00:14:00.172919 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29560334-7sx8j"] Mar 16 00:14:00 crc kubenswrapper[4816]: E0316 00:14:00.173807 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc1ea93d-1cf8-4145-ad35-83f2d1357f9d" containerName="registry-server" Mar 16 00:14:00 crc kubenswrapper[4816]: I0316 00:14:00.173824 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc1ea93d-1cf8-4145-ad35-83f2d1357f9d" containerName="registry-server" Mar 16 00:14:00 crc kubenswrapper[4816]: E0316 00:14:00.173841 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc1ea93d-1cf8-4145-ad35-83f2d1357f9d" containerName="extract-utilities" Mar 16 00:14:00 crc kubenswrapper[4816]: I0316 00:14:00.173849 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc1ea93d-1cf8-4145-ad35-83f2d1357f9d" containerName="extract-utilities" Mar 16 00:14:00 crc kubenswrapper[4816]: E0316 00:14:00.173861 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc1ea93d-1cf8-4145-ad35-83f2d1357f9d" containerName="extract-content" Mar 16 00:14:00 crc kubenswrapper[4816]: I0316 00:14:00.173869 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc1ea93d-1cf8-4145-ad35-83f2d1357f9d" containerName="extract-content" Mar 16 00:14:00 crc kubenswrapper[4816]: I0316 00:14:00.174005 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc1ea93d-1cf8-4145-ad35-83f2d1357f9d" containerName="registry-server" Mar 16 00:14:00 crc kubenswrapper[4816]: I0316 00:14:00.180756 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29560334-7sx8j"] Mar 16 00:14:00 crc kubenswrapper[4816]: I0316 00:14:00.180862 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29560334-7sx8j" Mar 16 00:14:00 crc kubenswrapper[4816]: I0316 00:14:00.184460 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 16 00:14:00 crc kubenswrapper[4816]: I0316 00:14:00.184704 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-8hc2r" Mar 16 00:14:00 crc kubenswrapper[4816]: I0316 00:14:00.184923 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 16 00:14:00 crc kubenswrapper[4816]: I0316 00:14:00.227200 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5d8q\" (UniqueName: \"kubernetes.io/projected/5160d394-3d9b-4066-9bea-b9dd787b2a42-kube-api-access-t5d8q\") pod \"auto-csr-approver-29560334-7sx8j\" (UID: \"5160d394-3d9b-4066-9bea-b9dd787b2a42\") " pod="openshift-infra/auto-csr-approver-29560334-7sx8j" Mar 16 00:14:00 crc kubenswrapper[4816]: I0316 00:14:00.328324 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t5d8q\" (UniqueName: \"kubernetes.io/projected/5160d394-3d9b-4066-9bea-b9dd787b2a42-kube-api-access-t5d8q\") pod \"auto-csr-approver-29560334-7sx8j\" (UID: \"5160d394-3d9b-4066-9bea-b9dd787b2a42\") " pod="openshift-infra/auto-csr-approver-29560334-7sx8j" Mar 16 00:14:00 crc kubenswrapper[4816]: I0316 00:14:00.360636 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5d8q\" (UniqueName: \"kubernetes.io/projected/5160d394-3d9b-4066-9bea-b9dd787b2a42-kube-api-access-t5d8q\") pod \"auto-csr-approver-29560334-7sx8j\" (UID: \"5160d394-3d9b-4066-9bea-b9dd787b2a42\") " pod="openshift-infra/auto-csr-approver-29560334-7sx8j" Mar 16 00:14:00 crc kubenswrapper[4816]: I0316 00:14:00.498004 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29560334-7sx8j" Mar 16 00:14:00 crc kubenswrapper[4816]: I0316 00:14:00.913258 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29560334-7sx8j"] Mar 16 00:14:00 crc kubenswrapper[4816]: W0316 00:14:00.924757 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5160d394_3d9b_4066_9bea_b9dd787b2a42.slice/crio-a091429ab10db04195b4dce0908b618f38a120a1d24a9b0b32f5812cce430ee8 WatchSource:0}: Error finding container a091429ab10db04195b4dce0908b618f38a120a1d24a9b0b32f5812cce430ee8: Status 404 returned error can't find the container with id a091429ab10db04195b4dce0908b618f38a120a1d24a9b0b32f5812cce430ee8 Mar 16 00:14:01 crc kubenswrapper[4816]: I0316 00:14:01.887444 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560334-7sx8j" event={"ID":"5160d394-3d9b-4066-9bea-b9dd787b2a42","Type":"ContainerStarted","Data":"a091429ab10db04195b4dce0908b618f38a120a1d24a9b0b32f5812cce430ee8"} Mar 16 00:14:01 crc kubenswrapper[4816]: I0316 00:14:01.942960 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7b94474fcc-vzgw6"] Mar 16 00:14:01 crc kubenswrapper[4816]: I0316 00:14:01.943452 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-7b94474fcc-vzgw6" podUID="14521451-81b6-4214-883a-cd05a9357517" containerName="controller-manager" containerID="cri-o://83a4b2018bae2ed06df319b69f24a3f9e5ae7f538f9077ab4ab4efca28173ee4" gracePeriod=30 Mar 16 00:14:02 crc kubenswrapper[4816]: I0316 00:14:02.039974 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-69d5f8f747-vqm27"] Mar 16 00:14:02 crc kubenswrapper[4816]: I0316 00:14:02.040464 4816 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-69d5f8f747-vqm27" podUID="2ab40c56-2ac9-49f5-8370-ff1ae7c5757f" containerName="route-controller-manager" containerID="cri-o://6fb0b9447a2d26404d4ce546e776345ac9036f9ccc4aa0b57f561efbdf3d4e9a" gracePeriod=30 Mar 16 00:14:02 crc kubenswrapper[4816]: I0316 00:14:02.519610 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-69d5f8f747-vqm27" Mar 16 00:14:02 crc kubenswrapper[4816]: I0316 00:14:02.531943 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7b94474fcc-vzgw6" Mar 16 00:14:02 crc kubenswrapper[4816]: I0316 00:14:02.555689 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/14521451-81b6-4214-883a-cd05a9357517-client-ca\") pod \"14521451-81b6-4214-883a-cd05a9357517\" (UID: \"14521451-81b6-4214-883a-cd05a9357517\") " Mar 16 00:14:02 crc kubenswrapper[4816]: I0316 00:14:02.555793 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tf9t\" (UniqueName: \"kubernetes.io/projected/2ab40c56-2ac9-49f5-8370-ff1ae7c5757f-kube-api-access-8tf9t\") pod \"2ab40c56-2ac9-49f5-8370-ff1ae7c5757f\" (UID: \"2ab40c56-2ac9-49f5-8370-ff1ae7c5757f\") " Mar 16 00:14:02 crc kubenswrapper[4816]: I0316 00:14:02.555857 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/14521451-81b6-4214-883a-cd05a9357517-serving-cert\") pod \"14521451-81b6-4214-883a-cd05a9357517\" (UID: \"14521451-81b6-4214-883a-cd05a9357517\") " Mar 16 00:14:02 crc kubenswrapper[4816]: I0316 00:14:02.555889 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/2ab40c56-2ac9-49f5-8370-ff1ae7c5757f-config\") pod \"2ab40c56-2ac9-49f5-8370-ff1ae7c5757f\" (UID: \"2ab40c56-2ac9-49f5-8370-ff1ae7c5757f\") " Mar 16 00:14:02 crc kubenswrapper[4816]: I0316 00:14:02.555962 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2ab40c56-2ac9-49f5-8370-ff1ae7c5757f-client-ca\") pod \"2ab40c56-2ac9-49f5-8370-ff1ae7c5757f\" (UID: \"2ab40c56-2ac9-49f5-8370-ff1ae7c5757f\") " Mar 16 00:14:02 crc kubenswrapper[4816]: I0316 00:14:02.556001 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/14521451-81b6-4214-883a-cd05a9357517-config\") pod \"14521451-81b6-4214-883a-cd05a9357517\" (UID: \"14521451-81b6-4214-883a-cd05a9357517\") " Mar 16 00:14:02 crc kubenswrapper[4816]: I0316 00:14:02.556032 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/14521451-81b6-4214-883a-cd05a9357517-proxy-ca-bundles\") pod \"14521451-81b6-4214-883a-cd05a9357517\" (UID: \"14521451-81b6-4214-883a-cd05a9357517\") " Mar 16 00:14:02 crc kubenswrapper[4816]: I0316 00:14:02.556067 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-khgb4\" (UniqueName: \"kubernetes.io/projected/14521451-81b6-4214-883a-cd05a9357517-kube-api-access-khgb4\") pod \"14521451-81b6-4214-883a-cd05a9357517\" (UID: \"14521451-81b6-4214-883a-cd05a9357517\") " Mar 16 00:14:02 crc kubenswrapper[4816]: I0316 00:14:02.556122 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2ab40c56-2ac9-49f5-8370-ff1ae7c5757f-serving-cert\") pod \"2ab40c56-2ac9-49f5-8370-ff1ae7c5757f\" (UID: \"2ab40c56-2ac9-49f5-8370-ff1ae7c5757f\") " Mar 16 00:14:02 crc kubenswrapper[4816]: I0316 
00:14:02.557497 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ab40c56-2ac9-49f5-8370-ff1ae7c5757f-client-ca" (OuterVolumeSpecName: "client-ca") pod "2ab40c56-2ac9-49f5-8370-ff1ae7c5757f" (UID: "2ab40c56-2ac9-49f5-8370-ff1ae7c5757f"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:14:02 crc kubenswrapper[4816]: I0316 00:14:02.558845 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/14521451-81b6-4214-883a-cd05a9357517-config" (OuterVolumeSpecName: "config") pod "14521451-81b6-4214-883a-cd05a9357517" (UID: "14521451-81b6-4214-883a-cd05a9357517"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:14:02 crc kubenswrapper[4816]: I0316 00:14:02.559305 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ab40c56-2ac9-49f5-8370-ff1ae7c5757f-config" (OuterVolumeSpecName: "config") pod "2ab40c56-2ac9-49f5-8370-ff1ae7c5757f" (UID: "2ab40c56-2ac9-49f5-8370-ff1ae7c5757f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:14:02 crc kubenswrapper[4816]: I0316 00:14:02.560025 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/14521451-81b6-4214-883a-cd05a9357517-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "14521451-81b6-4214-883a-cd05a9357517" (UID: "14521451-81b6-4214-883a-cd05a9357517"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:14:02 crc kubenswrapper[4816]: I0316 00:14:02.562089 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/14521451-81b6-4214-883a-cd05a9357517-client-ca" (OuterVolumeSpecName: "client-ca") pod "14521451-81b6-4214-883a-cd05a9357517" (UID: "14521451-81b6-4214-883a-cd05a9357517"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:14:02 crc kubenswrapper[4816]: I0316 00:14:02.562707 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ab40c56-2ac9-49f5-8370-ff1ae7c5757f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "2ab40c56-2ac9-49f5-8370-ff1ae7c5757f" (UID: "2ab40c56-2ac9-49f5-8370-ff1ae7c5757f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:14:02 crc kubenswrapper[4816]: I0316 00:14:02.565001 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ab40c56-2ac9-49f5-8370-ff1ae7c5757f-kube-api-access-8tf9t" (OuterVolumeSpecName: "kube-api-access-8tf9t") pod "2ab40c56-2ac9-49f5-8370-ff1ae7c5757f" (UID: "2ab40c56-2ac9-49f5-8370-ff1ae7c5757f"). InnerVolumeSpecName "kube-api-access-8tf9t". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:14:02 crc kubenswrapper[4816]: I0316 00:14:02.565058 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14521451-81b6-4214-883a-cd05a9357517-kube-api-access-khgb4" (OuterVolumeSpecName: "kube-api-access-khgb4") pod "14521451-81b6-4214-883a-cd05a9357517" (UID: "14521451-81b6-4214-883a-cd05a9357517"). InnerVolumeSpecName "kube-api-access-khgb4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:14:02 crc kubenswrapper[4816]: I0316 00:14:02.566787 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14521451-81b6-4214-883a-cd05a9357517-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "14521451-81b6-4214-883a-cd05a9357517" (UID: "14521451-81b6-4214-883a-cd05a9357517"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:14:02 crc kubenswrapper[4816]: I0316 00:14:02.657340 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tf9t\" (UniqueName: \"kubernetes.io/projected/2ab40c56-2ac9-49f5-8370-ff1ae7c5757f-kube-api-access-8tf9t\") on node \"crc\" DevicePath \"\"" Mar 16 00:14:02 crc kubenswrapper[4816]: I0316 00:14:02.657390 4816 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/14521451-81b6-4214-883a-cd05a9357517-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 16 00:14:02 crc kubenswrapper[4816]: I0316 00:14:02.657404 4816 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ab40c56-2ac9-49f5-8370-ff1ae7c5757f-config\") on node \"crc\" DevicePath \"\"" Mar 16 00:14:02 crc kubenswrapper[4816]: I0316 00:14:02.657414 4816 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2ab40c56-2ac9-49f5-8370-ff1ae7c5757f-client-ca\") on node \"crc\" DevicePath \"\"" Mar 16 00:14:02 crc kubenswrapper[4816]: I0316 00:14:02.657425 4816 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/14521451-81b6-4214-883a-cd05a9357517-config\") on node \"crc\" DevicePath \"\"" Mar 16 00:14:02 crc kubenswrapper[4816]: I0316 00:14:02.657436 4816 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/14521451-81b6-4214-883a-cd05a9357517-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 16 00:14:02 crc kubenswrapper[4816]: I0316 00:14:02.657445 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-khgb4\" (UniqueName: \"kubernetes.io/projected/14521451-81b6-4214-883a-cd05a9357517-kube-api-access-khgb4\") on node \"crc\" DevicePath \"\"" Mar 16 00:14:02 crc kubenswrapper[4816]: I0316 00:14:02.657455 4816 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2ab40c56-2ac9-49f5-8370-ff1ae7c5757f-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 16 00:14:02 crc kubenswrapper[4816]: I0316 00:14:02.657464 4816 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/14521451-81b6-4214-883a-cd05a9357517-client-ca\") on node \"crc\" DevicePath \"\"" Mar 16 00:14:02 crc kubenswrapper[4816]: I0316 00:14:02.896802 4816 generic.go:334] "Generic (PLEG): container finished" podID="14521451-81b6-4214-883a-cd05a9357517" containerID="83a4b2018bae2ed06df319b69f24a3f9e5ae7f538f9077ab4ab4efca28173ee4" exitCode=0 Mar 16 00:14:02 crc kubenswrapper[4816]: I0316 00:14:02.896875 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7b94474fcc-vzgw6" event={"ID":"14521451-81b6-4214-883a-cd05a9357517","Type":"ContainerDied","Data":"83a4b2018bae2ed06df319b69f24a3f9e5ae7f538f9077ab4ab4efca28173ee4"} Mar 16 00:14:02 crc kubenswrapper[4816]: I0316 00:14:02.896906 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7b94474fcc-vzgw6" event={"ID":"14521451-81b6-4214-883a-cd05a9357517","Type":"ContainerDied","Data":"7fd72c6bda85d63e3016f0f126428c17855b91cdeed58a7acc5610da9342d4f1"} Mar 16 00:14:02 crc kubenswrapper[4816]: I0316 00:14:02.896925 4816 scope.go:117] "RemoveContainer" 
containerID="83a4b2018bae2ed06df319b69f24a3f9e5ae7f538f9077ab4ab4efca28173ee4" Mar 16 00:14:02 crc kubenswrapper[4816]: I0316 00:14:02.897045 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7b94474fcc-vzgw6" Mar 16 00:14:02 crc kubenswrapper[4816]: I0316 00:14:02.900196 4816 generic.go:334] "Generic (PLEG): container finished" podID="2ab40c56-2ac9-49f5-8370-ff1ae7c5757f" containerID="6fb0b9447a2d26404d4ce546e776345ac9036f9ccc4aa0b57f561efbdf3d4e9a" exitCode=0 Mar 16 00:14:02 crc kubenswrapper[4816]: I0316 00:14:02.900264 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-69d5f8f747-vqm27" event={"ID":"2ab40c56-2ac9-49f5-8370-ff1ae7c5757f","Type":"ContainerDied","Data":"6fb0b9447a2d26404d4ce546e776345ac9036f9ccc4aa0b57f561efbdf3d4e9a"} Mar 16 00:14:02 crc kubenswrapper[4816]: I0316 00:14:02.900324 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-69d5f8f747-vqm27" event={"ID":"2ab40c56-2ac9-49f5-8370-ff1ae7c5757f","Type":"ContainerDied","Data":"916114dd72baadc2e3d1c4e882df4092bba32ea74a936ec4e52471a9ade09699"} Mar 16 00:14:02 crc kubenswrapper[4816]: I0316 00:14:02.900276 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-69d5f8f747-vqm27" Mar 16 00:14:02 crc kubenswrapper[4816]: I0316 00:14:02.904887 4816 generic.go:334] "Generic (PLEG): container finished" podID="5160d394-3d9b-4066-9bea-b9dd787b2a42" containerID="d0a220f8f08fc88ffdf56d37ec2ba1b59974be62f3a81d988b1462b4794a79a8" exitCode=0 Mar 16 00:14:02 crc kubenswrapper[4816]: I0316 00:14:02.904959 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560334-7sx8j" event={"ID":"5160d394-3d9b-4066-9bea-b9dd787b2a42","Type":"ContainerDied","Data":"d0a220f8f08fc88ffdf56d37ec2ba1b59974be62f3a81d988b1462b4794a79a8"} Mar 16 00:14:02 crc kubenswrapper[4816]: I0316 00:14:02.918217 4816 scope.go:117] "RemoveContainer" containerID="83a4b2018bae2ed06df319b69f24a3f9e5ae7f538f9077ab4ab4efca28173ee4" Mar 16 00:14:02 crc kubenswrapper[4816]: E0316 00:14:02.918618 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83a4b2018bae2ed06df319b69f24a3f9e5ae7f538f9077ab4ab4efca28173ee4\": container with ID starting with 83a4b2018bae2ed06df319b69f24a3f9e5ae7f538f9077ab4ab4efca28173ee4 not found: ID does not exist" containerID="83a4b2018bae2ed06df319b69f24a3f9e5ae7f538f9077ab4ab4efca28173ee4" Mar 16 00:14:02 crc kubenswrapper[4816]: I0316 00:14:02.918648 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83a4b2018bae2ed06df319b69f24a3f9e5ae7f538f9077ab4ab4efca28173ee4"} err="failed to get container status \"83a4b2018bae2ed06df319b69f24a3f9e5ae7f538f9077ab4ab4efca28173ee4\": rpc error: code = NotFound desc = could not find container \"83a4b2018bae2ed06df319b69f24a3f9e5ae7f538f9077ab4ab4efca28173ee4\": container with ID starting with 83a4b2018bae2ed06df319b69f24a3f9e5ae7f538f9077ab4ab4efca28173ee4 not found: ID does not exist" Mar 16 00:14:02 crc kubenswrapper[4816]: I0316 00:14:02.919120 4816 
scope.go:117] "RemoveContainer" containerID="6fb0b9447a2d26404d4ce546e776345ac9036f9ccc4aa0b57f561efbdf3d4e9a" Mar 16 00:14:02 crc kubenswrapper[4816]: I0316 00:14:02.935397 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7b94474fcc-vzgw6"] Mar 16 00:14:02 crc kubenswrapper[4816]: I0316 00:14:02.937211 4816 scope.go:117] "RemoveContainer" containerID="6fb0b9447a2d26404d4ce546e776345ac9036f9ccc4aa0b57f561efbdf3d4e9a" Mar 16 00:14:02 crc kubenswrapper[4816]: E0316 00:14:02.937664 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6fb0b9447a2d26404d4ce546e776345ac9036f9ccc4aa0b57f561efbdf3d4e9a\": container with ID starting with 6fb0b9447a2d26404d4ce546e776345ac9036f9ccc4aa0b57f561efbdf3d4e9a not found: ID does not exist" containerID="6fb0b9447a2d26404d4ce546e776345ac9036f9ccc4aa0b57f561efbdf3d4e9a" Mar 16 00:14:02 crc kubenswrapper[4816]: I0316 00:14:02.937707 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6fb0b9447a2d26404d4ce546e776345ac9036f9ccc4aa0b57f561efbdf3d4e9a"} err="failed to get container status \"6fb0b9447a2d26404d4ce546e776345ac9036f9ccc4aa0b57f561efbdf3d4e9a\": rpc error: code = NotFound desc = could not find container \"6fb0b9447a2d26404d4ce546e776345ac9036f9ccc4aa0b57f561efbdf3d4e9a\": container with ID starting with 6fb0b9447a2d26404d4ce546e776345ac9036f9ccc4aa0b57f561efbdf3d4e9a not found: ID does not exist" Mar 16 00:14:02 crc kubenswrapper[4816]: I0316 00:14:02.938811 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-7b94474fcc-vzgw6"] Mar 16 00:14:02 crc kubenswrapper[4816]: I0316 00:14:02.949230 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-69d5f8f747-vqm27"] Mar 16 00:14:02 crc kubenswrapper[4816]: I0316 00:14:02.955394 4816 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-69d5f8f747-vqm27"] Mar 16 00:14:03 crc kubenswrapper[4816]: I0316 00:14:03.522183 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-75955db86c-zz9wj"] Mar 16 00:14:03 crc kubenswrapper[4816]: E0316 00:14:03.522526 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14521451-81b6-4214-883a-cd05a9357517" containerName="controller-manager" Mar 16 00:14:03 crc kubenswrapper[4816]: I0316 00:14:03.522594 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="14521451-81b6-4214-883a-cd05a9357517" containerName="controller-manager" Mar 16 00:14:03 crc kubenswrapper[4816]: E0316 00:14:03.522632 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ab40c56-2ac9-49f5-8370-ff1ae7c5757f" containerName="route-controller-manager" Mar 16 00:14:03 crc kubenswrapper[4816]: I0316 00:14:03.522649 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ab40c56-2ac9-49f5-8370-ff1ae7c5757f" containerName="route-controller-manager" Mar 16 00:14:03 crc kubenswrapper[4816]: I0316 00:14:03.522819 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="14521451-81b6-4214-883a-cd05a9357517" containerName="controller-manager" Mar 16 00:14:03 crc kubenswrapper[4816]: I0316 00:14:03.522857 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ab40c56-2ac9-49f5-8370-ff1ae7c5757f" containerName="route-controller-manager" Mar 16 00:14:03 crc kubenswrapper[4816]: I0316 00:14:03.523446 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-75955db86c-zz9wj" Mar 16 00:14:03 crc kubenswrapper[4816]: I0316 00:14:03.525309 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 16 00:14:03 crc kubenswrapper[4816]: I0316 00:14:03.525901 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 16 00:14:03 crc kubenswrapper[4816]: I0316 00:14:03.526059 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 16 00:14:03 crc kubenswrapper[4816]: I0316 00:14:03.526355 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 16 00:14:03 crc kubenswrapper[4816]: I0316 00:14:03.526632 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 16 00:14:03 crc kubenswrapper[4816]: I0316 00:14:03.528718 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 16 00:14:03 crc kubenswrapper[4816]: I0316 00:14:03.530322 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-66849d8997-mdw7r"] Mar 16 00:14:03 crc kubenswrapper[4816]: I0316 00:14:03.531152 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-66849d8997-mdw7r" Mar 16 00:14:03 crc kubenswrapper[4816]: I0316 00:14:03.534132 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 16 00:14:03 crc kubenswrapper[4816]: I0316 00:14:03.534153 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 16 00:14:03 crc kubenswrapper[4816]: I0316 00:14:03.534156 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 16 00:14:03 crc kubenswrapper[4816]: I0316 00:14:03.535881 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 16 00:14:03 crc kubenswrapper[4816]: I0316 00:14:03.535970 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 16 00:14:03 crc kubenswrapper[4816]: I0316 00:14:03.537219 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 16 00:14:03 crc kubenswrapper[4816]: I0316 00:14:03.540988 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 16 00:14:03 crc kubenswrapper[4816]: I0316 00:14:03.541721 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-75955db86c-zz9wj"] Mar 16 00:14:03 crc kubenswrapper[4816]: I0316 00:14:03.546466 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-66849d8997-mdw7r"] Mar 16 00:14:03 crc kubenswrapper[4816]: I0316 00:14:03.569052 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/d54cdab6-e848-4a30-b64f-7b257a403479-config\") pod \"controller-manager-66849d8997-mdw7r\" (UID: \"d54cdab6-e848-4a30-b64f-7b257a403479\") " pod="openshift-controller-manager/controller-manager-66849d8997-mdw7r" Mar 16 00:14:03 crc kubenswrapper[4816]: I0316 00:14:03.569345 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ee03d74f-fc71-4caa-b296-7bde75124d84-client-ca\") pod \"route-controller-manager-75955db86c-zz9wj\" (UID: \"ee03d74f-fc71-4caa-b296-7bde75124d84\") " pod="openshift-route-controller-manager/route-controller-manager-75955db86c-zz9wj" Mar 16 00:14:03 crc kubenswrapper[4816]: I0316 00:14:03.569503 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d54cdab6-e848-4a30-b64f-7b257a403479-client-ca\") pod \"controller-manager-66849d8997-mdw7r\" (UID: \"d54cdab6-e848-4a30-b64f-7b257a403479\") " pod="openshift-controller-manager/controller-manager-66849d8997-mdw7r" Mar 16 00:14:03 crc kubenswrapper[4816]: I0316 00:14:03.569653 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ee03d74f-fc71-4caa-b296-7bde75124d84-serving-cert\") pod \"route-controller-manager-75955db86c-zz9wj\" (UID: \"ee03d74f-fc71-4caa-b296-7bde75124d84\") " pod="openshift-route-controller-manager/route-controller-manager-75955db86c-zz9wj" Mar 16 00:14:03 crc kubenswrapper[4816]: I0316 00:14:03.569826 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee03d74f-fc71-4caa-b296-7bde75124d84-config\") pod \"route-controller-manager-75955db86c-zz9wj\" (UID: \"ee03d74f-fc71-4caa-b296-7bde75124d84\") " 
pod="openshift-route-controller-manager/route-controller-manager-75955db86c-zz9wj"
Mar 16 00:14:03 crc kubenswrapper[4816]: I0316 00:14:03.569905 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b56ft\" (UniqueName: \"kubernetes.io/projected/ee03d74f-fc71-4caa-b296-7bde75124d84-kube-api-access-b56ft\") pod \"route-controller-manager-75955db86c-zz9wj\" (UID: \"ee03d74f-fc71-4caa-b296-7bde75124d84\") " pod="openshift-route-controller-manager/route-controller-manager-75955db86c-zz9wj"
Mar 16 00:14:03 crc kubenswrapper[4816]: I0316 00:14:03.569941 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tpr8t\" (UniqueName: \"kubernetes.io/projected/d54cdab6-e848-4a30-b64f-7b257a403479-kube-api-access-tpr8t\") pod \"controller-manager-66849d8997-mdw7r\" (UID: \"d54cdab6-e848-4a30-b64f-7b257a403479\") " pod="openshift-controller-manager/controller-manager-66849d8997-mdw7r"
Mar 16 00:14:03 crc kubenswrapper[4816]: I0316 00:14:03.569988 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d54cdab6-e848-4a30-b64f-7b257a403479-proxy-ca-bundles\") pod \"controller-manager-66849d8997-mdw7r\" (UID: \"d54cdab6-e848-4a30-b64f-7b257a403479\") " pod="openshift-controller-manager/controller-manager-66849d8997-mdw7r"
Mar 16 00:14:03 crc kubenswrapper[4816]: I0316 00:14:03.570016 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d54cdab6-e848-4a30-b64f-7b257a403479-serving-cert\") pod \"controller-manager-66849d8997-mdw7r\" (UID: \"d54cdab6-e848-4a30-b64f-7b257a403479\") " pod="openshift-controller-manager/controller-manager-66849d8997-mdw7r"
Mar 16 00:14:03 crc kubenswrapper[4816]: I0316 00:14:03.670533 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ee03d74f-fc71-4caa-b296-7bde75124d84-client-ca\") pod \"route-controller-manager-75955db86c-zz9wj\" (UID: \"ee03d74f-fc71-4caa-b296-7bde75124d84\") " pod="openshift-route-controller-manager/route-controller-manager-75955db86c-zz9wj"
Mar 16 00:14:03 crc kubenswrapper[4816]: I0316 00:14:03.670630 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d54cdab6-e848-4a30-b64f-7b257a403479-client-ca\") pod \"controller-manager-66849d8997-mdw7r\" (UID: \"d54cdab6-e848-4a30-b64f-7b257a403479\") " pod="openshift-controller-manager/controller-manager-66849d8997-mdw7r"
Mar 16 00:14:03 crc kubenswrapper[4816]: I0316 00:14:03.670688 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ee03d74f-fc71-4caa-b296-7bde75124d84-serving-cert\") pod \"route-controller-manager-75955db86c-zz9wj\" (UID: \"ee03d74f-fc71-4caa-b296-7bde75124d84\") " pod="openshift-route-controller-manager/route-controller-manager-75955db86c-zz9wj"
Mar 16 00:14:03 crc kubenswrapper[4816]: I0316 00:14:03.670752 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee03d74f-fc71-4caa-b296-7bde75124d84-config\") pod \"route-controller-manager-75955db86c-zz9wj\" (UID: \"ee03d74f-fc71-4caa-b296-7bde75124d84\") " pod="openshift-route-controller-manager/route-controller-manager-75955db86c-zz9wj"
Mar 16 00:14:03 crc kubenswrapper[4816]: I0316 00:14:03.670785 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b56ft\" (UniqueName: \"kubernetes.io/projected/ee03d74f-fc71-4caa-b296-7bde75124d84-kube-api-access-b56ft\") pod \"route-controller-manager-75955db86c-zz9wj\" (UID: \"ee03d74f-fc71-4caa-b296-7bde75124d84\") " pod="openshift-route-controller-manager/route-controller-manager-75955db86c-zz9wj"
Mar 16 00:14:03 crc kubenswrapper[4816]: I0316 00:14:03.670806 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tpr8t\" (UniqueName: \"kubernetes.io/projected/d54cdab6-e848-4a30-b64f-7b257a403479-kube-api-access-tpr8t\") pod \"controller-manager-66849d8997-mdw7r\" (UID: \"d54cdab6-e848-4a30-b64f-7b257a403479\") " pod="openshift-controller-manager/controller-manager-66849d8997-mdw7r"
Mar 16 00:14:03 crc kubenswrapper[4816]: I0316 00:14:03.670835 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d54cdab6-e848-4a30-b64f-7b257a403479-proxy-ca-bundles\") pod \"controller-manager-66849d8997-mdw7r\" (UID: \"d54cdab6-e848-4a30-b64f-7b257a403479\") " pod="openshift-controller-manager/controller-manager-66849d8997-mdw7r"
Mar 16 00:14:03 crc kubenswrapper[4816]: I0316 00:14:03.670855 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d54cdab6-e848-4a30-b64f-7b257a403479-serving-cert\") pod \"controller-manager-66849d8997-mdw7r\" (UID: \"d54cdab6-e848-4a30-b64f-7b257a403479\") " pod="openshift-controller-manager/controller-manager-66849d8997-mdw7r"
Mar 16 00:14:03 crc kubenswrapper[4816]: I0316 00:14:03.670904 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d54cdab6-e848-4a30-b64f-7b257a403479-config\") pod \"controller-manager-66849d8997-mdw7r\" (UID: \"d54cdab6-e848-4a30-b64f-7b257a403479\") " pod="openshift-controller-manager/controller-manager-66849d8997-mdw7r"
Mar 16 00:14:03 crc kubenswrapper[4816]: I0316 00:14:03.671942 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d54cdab6-e848-4a30-b64f-7b257a403479-client-ca\") pod \"controller-manager-66849d8997-mdw7r\" (UID: \"d54cdab6-e848-4a30-b64f-7b257a403479\") " pod="openshift-controller-manager/controller-manager-66849d8997-mdw7r"
Mar 16 00:14:03 crc kubenswrapper[4816]: I0316 00:14:03.672801 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d54cdab6-e848-4a30-b64f-7b257a403479-proxy-ca-bundles\") pod \"controller-manager-66849d8997-mdw7r\" (UID: \"d54cdab6-e848-4a30-b64f-7b257a403479\") " pod="openshift-controller-manager/controller-manager-66849d8997-mdw7r"
Mar 16 00:14:03 crc kubenswrapper[4816]: I0316 00:14:03.672951 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ee03d74f-fc71-4caa-b296-7bde75124d84-client-ca\") pod \"route-controller-manager-75955db86c-zz9wj\" (UID: \"ee03d74f-fc71-4caa-b296-7bde75124d84\") " pod="openshift-route-controller-manager/route-controller-manager-75955db86c-zz9wj"
Mar 16 00:14:03 crc kubenswrapper[4816]: I0316 00:14:03.674397 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d54cdab6-e848-4a30-b64f-7b257a403479-config\") pod \"controller-manager-66849d8997-mdw7r\" (UID: \"d54cdab6-e848-4a30-b64f-7b257a403479\") " pod="openshift-controller-manager/controller-manager-66849d8997-mdw7r"
Mar 16 00:14:03 crc kubenswrapper[4816]: I0316 00:14:03.675541 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ee03d74f-fc71-4caa-b296-7bde75124d84-serving-cert\") pod \"route-controller-manager-75955db86c-zz9wj\" (UID: \"ee03d74f-fc71-4caa-b296-7bde75124d84\") " pod="openshift-route-controller-manager/route-controller-manager-75955db86c-zz9wj"
Mar 16 00:14:03 crc kubenswrapper[4816]: I0316 00:14:03.676047 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="14521451-81b6-4214-883a-cd05a9357517" path="/var/lib/kubelet/pods/14521451-81b6-4214-883a-cd05a9357517/volumes"
Mar 16 00:14:03 crc kubenswrapper[4816]: I0316 00:14:03.676763 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ab40c56-2ac9-49f5-8370-ff1ae7c5757f" path="/var/lib/kubelet/pods/2ab40c56-2ac9-49f5-8370-ff1ae7c5757f/volumes"
Mar 16 00:14:03 crc kubenswrapper[4816]: I0316 00:14:03.686411 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d54cdab6-e848-4a30-b64f-7b257a403479-serving-cert\") pod \"controller-manager-66849d8997-mdw7r\" (UID: \"d54cdab6-e848-4a30-b64f-7b257a403479\") " pod="openshift-controller-manager/controller-manager-66849d8997-mdw7r"
Mar 16 00:14:03 crc kubenswrapper[4816]: I0316 00:14:03.687308 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee03d74f-fc71-4caa-b296-7bde75124d84-config\") pod \"route-controller-manager-75955db86c-zz9wj\" (UID: \"ee03d74f-fc71-4caa-b296-7bde75124d84\") " pod="openshift-route-controller-manager/route-controller-manager-75955db86c-zz9wj"
Mar 16 00:14:03 crc kubenswrapper[4816]: I0316 00:14:03.690032 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tpr8t\" (UniqueName: \"kubernetes.io/projected/d54cdab6-e848-4a30-b64f-7b257a403479-kube-api-access-tpr8t\") pod \"controller-manager-66849d8997-mdw7r\" (UID: \"d54cdab6-e848-4a30-b64f-7b257a403479\") " pod="openshift-controller-manager/controller-manager-66849d8997-mdw7r"
Mar 16 00:14:03 crc kubenswrapper[4816]: I0316 00:14:03.692239 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b56ft\" (UniqueName: \"kubernetes.io/projected/ee03d74f-fc71-4caa-b296-7bde75124d84-kube-api-access-b56ft\") pod \"route-controller-manager-75955db86c-zz9wj\" (UID: \"ee03d74f-fc71-4caa-b296-7bde75124d84\") " pod="openshift-route-controller-manager/route-controller-manager-75955db86c-zz9wj"
Mar 16 00:14:03 crc kubenswrapper[4816]: I0316 00:14:03.850173 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-75955db86c-zz9wj"
Mar 16 00:14:03 crc kubenswrapper[4816]: I0316 00:14:03.863869 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-66849d8997-mdw7r"
Mar 16 00:14:04 crc kubenswrapper[4816]: I0316 00:14:04.166111 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560334-7sx8j"
Mar 16 00:14:04 crc kubenswrapper[4816]: I0316 00:14:04.177170 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t5d8q\" (UniqueName: \"kubernetes.io/projected/5160d394-3d9b-4066-9bea-b9dd787b2a42-kube-api-access-t5d8q\") pod \"5160d394-3d9b-4066-9bea-b9dd787b2a42\" (UID: \"5160d394-3d9b-4066-9bea-b9dd787b2a42\") "
Mar 16 00:14:04 crc kubenswrapper[4816]: I0316 00:14:04.182119 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5160d394-3d9b-4066-9bea-b9dd787b2a42-kube-api-access-t5d8q" (OuterVolumeSpecName: "kube-api-access-t5d8q") pod "5160d394-3d9b-4066-9bea-b9dd787b2a42" (UID: "5160d394-3d9b-4066-9bea-b9dd787b2a42"). InnerVolumeSpecName "kube-api-access-t5d8q". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 16 00:14:04 crc kubenswrapper[4816]: I0316 00:14:04.277670 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t5d8q\" (UniqueName: \"kubernetes.io/projected/5160d394-3d9b-4066-9bea-b9dd787b2a42-kube-api-access-t5d8q\") on node \"crc\" DevicePath \"\""
Mar 16 00:14:04 crc kubenswrapper[4816]: I0316 00:14:04.304409 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-66849d8997-mdw7r"]
Mar 16 00:14:04 crc kubenswrapper[4816]: W0316 00:14:04.309436 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd54cdab6_e848_4a30_b64f_7b257a403479.slice/crio-740a40d4e9799ef7106757215750d89d9a0af1e6c2b93294ab89c6e7631aa93f WatchSource:0}: Error finding container 740a40d4e9799ef7106757215750d89d9a0af1e6c2b93294ab89c6e7631aa93f: Status 404 returned error can't find the container with id 740a40d4e9799ef7106757215750d89d9a0af1e6c2b93294ab89c6e7631aa93f
Mar 16 00:14:04 crc kubenswrapper[4816]: I0316 00:14:04.361696 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-75955db86c-zz9wj"]
Mar 16 00:14:04 crc kubenswrapper[4816]: I0316 00:14:04.917371 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560334-7sx8j" event={"ID":"5160d394-3d9b-4066-9bea-b9dd787b2a42","Type":"ContainerDied","Data":"a091429ab10db04195b4dce0908b618f38a120a1d24a9b0b32f5812cce430ee8"}
Mar 16 00:14:04 crc kubenswrapper[4816]: I0316 00:14:04.917719 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a091429ab10db04195b4dce0908b618f38a120a1d24a9b0b32f5812cce430ee8"
Mar 16 00:14:04 crc kubenswrapper[4816]: I0316 00:14:04.917632 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560334-7sx8j"
Mar 16 00:14:04 crc kubenswrapper[4816]: I0316 00:14:04.919047 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-75955db86c-zz9wj" event={"ID":"ee03d74f-fc71-4caa-b296-7bde75124d84","Type":"ContainerStarted","Data":"0ee857504da73463e9d37b707f935efcd42a2a931f2b0b482fe4a93572c57848"}
Mar 16 00:14:04 crc kubenswrapper[4816]: I0316 00:14:04.919098 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-75955db86c-zz9wj" event={"ID":"ee03d74f-fc71-4caa-b296-7bde75124d84","Type":"ContainerStarted","Data":"3979922d6fce0cead24bc526158f3a5bdcfa05832f696a88b2f4edcb0bfa3aa5"}
Mar 16 00:14:04 crc kubenswrapper[4816]: I0316 00:14:04.919346 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-75955db86c-zz9wj"
Mar 16 00:14:04 crc kubenswrapper[4816]: I0316 00:14:04.921382 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-66849d8997-mdw7r" event={"ID":"d54cdab6-e848-4a30-b64f-7b257a403479","Type":"ContainerStarted","Data":"f446c2ae65638de11812a4e1adcc4638681b15af837aa10730e65bcb03368dfd"}
Mar 16 00:14:04 crc kubenswrapper[4816]: I0316 00:14:04.921766 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-66849d8997-mdw7r" event={"ID":"d54cdab6-e848-4a30-b64f-7b257a403479","Type":"ContainerStarted","Data":"740a40d4e9799ef7106757215750d89d9a0af1e6c2b93294ab89c6e7631aa93f"}
Mar 16 00:14:04 crc kubenswrapper[4816]: I0316 00:14:04.921798 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-66849d8997-mdw7r"
Mar 16 00:14:04 crc kubenswrapper[4816]: I0316 00:14:04.926430 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-75955db86c-zz9wj"
Mar 16 00:14:04 crc kubenswrapper[4816]: I0316 00:14:04.926870 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-66849d8997-mdw7r"
Mar 16 00:14:04 crc kubenswrapper[4816]: I0316 00:14:04.940771 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-75955db86c-zz9wj" podStartSLOduration=2.940753571 podStartE2EDuration="2.940753571s" podCreationTimestamp="2026-03-16 00:14:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:14:04.938971989 +0000 UTC m=+438.035271942" watchObservedRunningTime="2026-03-16 00:14:04.940753571 +0000 UTC m=+438.037053524"
Mar 16 00:14:04 crc kubenswrapper[4816]: I0316 00:14:04.980810 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-66849d8997-mdw7r" podStartSLOduration=3.98078479 podStartE2EDuration="3.98078479s" podCreationTimestamp="2026-03-16 00:14:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:14:04.979651817 +0000 UTC m=+438.075951780" watchObservedRunningTime="2026-03-16 00:14:04.98078479 +0000 UTC m=+438.077084743"
Mar 16 00:14:21 crc kubenswrapper[4816]: I0316 00:14:21.898170 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-66849d8997-mdw7r"]
Mar 16 00:14:21 crc kubenswrapper[4816]: I0316 00:14:21.899021 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-66849d8997-mdw7r" podUID="d54cdab6-e848-4a30-b64f-7b257a403479" containerName="controller-manager" containerID="cri-o://f446c2ae65638de11812a4e1adcc4638681b15af837aa10730e65bcb03368dfd" gracePeriod=30
Mar 16 00:14:21 crc kubenswrapper[4816]: I0316 00:14:21.931320 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-75955db86c-zz9wj"]
Mar 16 00:14:21 crc kubenswrapper[4816]: I0316 00:14:21.931531 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-75955db86c-zz9wj" podUID="ee03d74f-fc71-4caa-b296-7bde75124d84" containerName="route-controller-manager" containerID="cri-o://0ee857504da73463e9d37b707f935efcd42a2a931f2b0b482fe4a93572c57848" gracePeriod=30
Mar 16 00:14:22 crc kubenswrapper[4816]: I0316 00:14:22.035351 4816 generic.go:334] "Generic (PLEG): container finished" podID="d54cdab6-e848-4a30-b64f-7b257a403479" containerID="f446c2ae65638de11812a4e1adcc4638681b15af837aa10730e65bcb03368dfd" exitCode=0
Mar 16 00:14:22 crc kubenswrapper[4816]: I0316 00:14:22.035393 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-66849d8997-mdw7r" event={"ID":"d54cdab6-e848-4a30-b64f-7b257a403479","Type":"ContainerDied","Data":"f446c2ae65638de11812a4e1adcc4638681b15af837aa10730e65bcb03368dfd"}
Mar 16 00:14:22 crc kubenswrapper[4816]: I0316 00:14:22.446770 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-75955db86c-zz9wj"
Mar 16 00:14:22 crc kubenswrapper[4816]: I0316 00:14:22.544368 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-66849d8997-mdw7r"
Mar 16 00:14:22 crc kubenswrapper[4816]: I0316 00:14:22.626299 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ee03d74f-fc71-4caa-b296-7bde75124d84-serving-cert\") pod \"ee03d74f-fc71-4caa-b296-7bde75124d84\" (UID: \"ee03d74f-fc71-4caa-b296-7bde75124d84\") "
Mar 16 00:14:22 crc kubenswrapper[4816]: I0316 00:14:22.626354 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee03d74f-fc71-4caa-b296-7bde75124d84-config\") pod \"ee03d74f-fc71-4caa-b296-7bde75124d84\" (UID: \"ee03d74f-fc71-4caa-b296-7bde75124d84\") "
Mar 16 00:14:22 crc kubenswrapper[4816]: I0316 00:14:22.626478 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d54cdab6-e848-4a30-b64f-7b257a403479-config\") pod \"d54cdab6-e848-4a30-b64f-7b257a403479\" (UID: \"d54cdab6-e848-4a30-b64f-7b257a403479\") "
Mar 16 00:14:22 crc kubenswrapper[4816]: I0316 00:14:22.626506 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b56ft\" (UniqueName: \"kubernetes.io/projected/ee03d74f-fc71-4caa-b296-7bde75124d84-kube-api-access-b56ft\") pod \"ee03d74f-fc71-4caa-b296-7bde75124d84\" (UID: \"ee03d74f-fc71-4caa-b296-7bde75124d84\") "
Mar 16 00:14:22 crc kubenswrapper[4816]: I0316 00:14:22.626540 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d54cdab6-e848-4a30-b64f-7b257a403479-proxy-ca-bundles\") pod \"d54cdab6-e848-4a30-b64f-7b257a403479\" (UID: \"d54cdab6-e848-4a30-b64f-7b257a403479\") "
Mar 16 00:14:22 crc kubenswrapper[4816]: I0316 00:14:22.626581 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ee03d74f-fc71-4caa-b296-7bde75124d84-client-ca\") pod \"ee03d74f-fc71-4caa-b296-7bde75124d84\" (UID: \"ee03d74f-fc71-4caa-b296-7bde75124d84\") "
Mar 16 00:14:22 crc kubenswrapper[4816]: I0316 00:14:22.627240 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee03d74f-fc71-4caa-b296-7bde75124d84-config" (OuterVolumeSpecName: "config") pod "ee03d74f-fc71-4caa-b296-7bde75124d84" (UID: "ee03d74f-fc71-4caa-b296-7bde75124d84"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 16 00:14:22 crc kubenswrapper[4816]: I0316 00:14:22.627308 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tpr8t\" (UniqueName: \"kubernetes.io/projected/d54cdab6-e848-4a30-b64f-7b257a403479-kube-api-access-tpr8t\") pod \"d54cdab6-e848-4a30-b64f-7b257a403479\" (UID: \"d54cdab6-e848-4a30-b64f-7b257a403479\") "
Mar 16 00:14:22 crc kubenswrapper[4816]: I0316 00:14:22.627337 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d54cdab6-e848-4a30-b64f-7b257a403479-client-ca\") pod \"d54cdab6-e848-4a30-b64f-7b257a403479\" (UID: \"d54cdab6-e848-4a30-b64f-7b257a403479\") "
Mar 16 00:14:22 crc kubenswrapper[4816]: I0316 00:14:22.627456 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d54cdab6-e848-4a30-b64f-7b257a403479-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "d54cdab6-e848-4a30-b64f-7b257a403479" (UID: "d54cdab6-e848-4a30-b64f-7b257a403479"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 16 00:14:22 crc kubenswrapper[4816]: I0316 00:14:22.627502 4816 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee03d74f-fc71-4caa-b296-7bde75124d84-config\") on node \"crc\" DevicePath \"\""
Mar 16 00:14:22 crc kubenswrapper[4816]: I0316 00:14:22.627501 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d54cdab6-e848-4a30-b64f-7b257a403479-config" (OuterVolumeSpecName: "config") pod "d54cdab6-e848-4a30-b64f-7b257a403479" (UID: "d54cdab6-e848-4a30-b64f-7b257a403479"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 16 00:14:22 crc kubenswrapper[4816]: I0316 00:14:22.627739 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d54cdab6-e848-4a30-b64f-7b257a403479-client-ca" (OuterVolumeSpecName: "client-ca") pod "d54cdab6-e848-4a30-b64f-7b257a403479" (UID: "d54cdab6-e848-4a30-b64f-7b257a403479"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 16 00:14:22 crc kubenswrapper[4816]: I0316 00:14:22.627927 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee03d74f-fc71-4caa-b296-7bde75124d84-client-ca" (OuterVolumeSpecName: "client-ca") pod "ee03d74f-fc71-4caa-b296-7bde75124d84" (UID: "ee03d74f-fc71-4caa-b296-7bde75124d84"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 16 00:14:22 crc kubenswrapper[4816]: I0316 00:14:22.632418 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee03d74f-fc71-4caa-b296-7bde75124d84-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "ee03d74f-fc71-4caa-b296-7bde75124d84" (UID: "ee03d74f-fc71-4caa-b296-7bde75124d84"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 16 00:14:22 crc kubenswrapper[4816]: I0316 00:14:22.632478 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee03d74f-fc71-4caa-b296-7bde75124d84-kube-api-access-b56ft" (OuterVolumeSpecName: "kube-api-access-b56ft") pod "ee03d74f-fc71-4caa-b296-7bde75124d84" (UID: "ee03d74f-fc71-4caa-b296-7bde75124d84"). InnerVolumeSpecName "kube-api-access-b56ft". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 16 00:14:22 crc kubenswrapper[4816]: I0316 00:14:22.632723 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d54cdab6-e848-4a30-b64f-7b257a403479-kube-api-access-tpr8t" (OuterVolumeSpecName: "kube-api-access-tpr8t") pod "d54cdab6-e848-4a30-b64f-7b257a403479" (UID: "d54cdab6-e848-4a30-b64f-7b257a403479"). InnerVolumeSpecName "kube-api-access-tpr8t". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 16 00:14:22 crc kubenswrapper[4816]: I0316 00:14:22.728033 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d54cdab6-e848-4a30-b64f-7b257a403479-serving-cert\") pod \"d54cdab6-e848-4a30-b64f-7b257a403479\" (UID: \"d54cdab6-e848-4a30-b64f-7b257a403479\") "
Mar 16 00:14:22 crc kubenswrapper[4816]: I0316 00:14:22.728326 4816 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ee03d74f-fc71-4caa-b296-7bde75124d84-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 16 00:14:22 crc kubenswrapper[4816]: I0316 00:14:22.728801 4816 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d54cdab6-e848-4a30-b64f-7b257a403479-config\") on node \"crc\" DevicePath \"\""
Mar 16 00:14:22 crc kubenswrapper[4816]: I0316 00:14:22.728824 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b56ft\" (UniqueName: \"kubernetes.io/projected/ee03d74f-fc71-4caa-b296-7bde75124d84-kube-api-access-b56ft\") on node \"crc\" DevicePath \"\""
Mar 16 00:14:22 crc kubenswrapper[4816]: I0316 00:14:22.728837 4816 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d54cdab6-e848-4a30-b64f-7b257a403479-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 16 00:14:22 crc kubenswrapper[4816]: I0316 00:14:22.728865 4816 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ee03d74f-fc71-4caa-b296-7bde75124d84-client-ca\") on node \"crc\" DevicePath \"\""
Mar 16 00:14:22 crc kubenswrapper[4816]: I0316 00:14:22.728878 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tpr8t\" (UniqueName: \"kubernetes.io/projected/d54cdab6-e848-4a30-b64f-7b257a403479-kube-api-access-tpr8t\") on node \"crc\" DevicePath \"\""
Mar 16 00:14:22 crc kubenswrapper[4816]: I0316 00:14:22.728889 4816 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d54cdab6-e848-4a30-b64f-7b257a403479-client-ca\") on node \"crc\" DevicePath \"\""
Mar 16 00:14:22 crc kubenswrapper[4816]: I0316 00:14:22.730677 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d54cdab6-e848-4a30-b64f-7b257a403479-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "d54cdab6-e848-4a30-b64f-7b257a403479" (UID: "d54cdab6-e848-4a30-b64f-7b257a403479"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 16 00:14:22 crc kubenswrapper[4816]: I0316 00:14:22.831477 4816 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d54cdab6-e848-4a30-b64f-7b257a403479-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 16 00:14:23 crc kubenswrapper[4816]: I0316 00:14:23.044391 4816 generic.go:334] "Generic (PLEG): container finished" podID="ee03d74f-fc71-4caa-b296-7bde75124d84" containerID="0ee857504da73463e9d37b707f935efcd42a2a931f2b0b482fe4a93572c57848" exitCode=0
Mar 16 00:14:23 crc kubenswrapper[4816]: I0316 00:14:23.044582 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-75955db86c-zz9wj"
Mar 16 00:14:23 crc kubenswrapper[4816]: I0316 00:14:23.044591 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-75955db86c-zz9wj" event={"ID":"ee03d74f-fc71-4caa-b296-7bde75124d84","Type":"ContainerDied","Data":"0ee857504da73463e9d37b707f935efcd42a2a931f2b0b482fe4a93572c57848"}
Mar 16 00:14:23 crc kubenswrapper[4816]: I0316 00:14:23.044751 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-75955db86c-zz9wj" event={"ID":"ee03d74f-fc71-4caa-b296-7bde75124d84","Type":"ContainerDied","Data":"3979922d6fce0cead24bc526158f3a5bdcfa05832f696a88b2f4edcb0bfa3aa5"}
Mar 16 00:14:23 crc kubenswrapper[4816]: I0316 00:14:23.044788 4816 scope.go:117] "RemoveContainer" containerID="0ee857504da73463e9d37b707f935efcd42a2a931f2b0b482fe4a93572c57848"
Mar 16 00:14:23 crc kubenswrapper[4816]: I0316 00:14:23.047465 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-66849d8997-mdw7r" event={"ID":"d54cdab6-e848-4a30-b64f-7b257a403479","Type":"ContainerDied","Data":"740a40d4e9799ef7106757215750d89d9a0af1e6c2b93294ab89c6e7631aa93f"}
Mar 16 00:14:23 crc kubenswrapper[4816]: I0316 00:14:23.047706 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-66849d8997-mdw7r"
Mar 16 00:14:23 crc kubenswrapper[4816]: I0316 00:14:23.068364 4816 scope.go:117] "RemoveContainer" containerID="0ee857504da73463e9d37b707f935efcd42a2a931f2b0b482fe4a93572c57848"
Mar 16 00:14:23 crc kubenswrapper[4816]: E0316 00:14:23.070454 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ee857504da73463e9d37b707f935efcd42a2a931f2b0b482fe4a93572c57848\": container with ID starting with 0ee857504da73463e9d37b707f935efcd42a2a931f2b0b482fe4a93572c57848 not found: ID does not exist" containerID="0ee857504da73463e9d37b707f935efcd42a2a931f2b0b482fe4a93572c57848"
Mar 16 00:14:23 crc kubenswrapper[4816]: I0316 00:14:23.070497 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ee857504da73463e9d37b707f935efcd42a2a931f2b0b482fe4a93572c57848"} err="failed to get container status \"0ee857504da73463e9d37b707f935efcd42a2a931f2b0b482fe4a93572c57848\": rpc error: code = NotFound desc = could not find container \"0ee857504da73463e9d37b707f935efcd42a2a931f2b0b482fe4a93572c57848\": container with ID starting with 0ee857504da73463e9d37b707f935efcd42a2a931f2b0b482fe4a93572c57848 not found: ID does not exist"
Mar 16 00:14:23 crc kubenswrapper[4816]: I0316 00:14:23.070523 4816 scope.go:117] "RemoveContainer" containerID="f446c2ae65638de11812a4e1adcc4638681b15af837aa10730e65bcb03368dfd"
Mar 16 00:14:23 crc kubenswrapper[4816]: I0316 00:14:23.102754 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-75955db86c-zz9wj"]
Mar 16 00:14:23 crc kubenswrapper[4816]: I0316 00:14:23.103881 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-75955db86c-zz9wj"]
Mar 16 00:14:23 crc kubenswrapper[4816]: I0316 00:14:23.119392 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-66849d8997-mdw7r"]
Mar 16 00:14:23 crc kubenswrapper[4816]: I0316 00:14:23.124057 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-66849d8997-mdw7r"]
Mar 16 00:14:23 crc kubenswrapper[4816]: I0316 00:14:23.539712 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5cff4cf95-6lkfg"]
Mar 16 00:14:23 crc kubenswrapper[4816]: E0316 00:14:23.539978 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5160d394-3d9b-4066-9bea-b9dd787b2a42" containerName="oc"
Mar 16 00:14:23 crc kubenswrapper[4816]: I0316 00:14:23.539992 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="5160d394-3d9b-4066-9bea-b9dd787b2a42" containerName="oc"
Mar 16 00:14:23 crc kubenswrapper[4816]: E0316 00:14:23.540019 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d54cdab6-e848-4a30-b64f-7b257a403479" containerName="controller-manager"
Mar 16 00:14:23 crc kubenswrapper[4816]: I0316 00:14:23.540027 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="d54cdab6-e848-4a30-b64f-7b257a403479" containerName="controller-manager"
Mar 16 00:14:23 crc kubenswrapper[4816]: E0316 00:14:23.540044 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee03d74f-fc71-4caa-b296-7bde75124d84" containerName="route-controller-manager"
Mar 16 00:14:23 crc kubenswrapper[4816]: I0316 00:14:23.540053 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee03d74f-fc71-4caa-b296-7bde75124d84" containerName="route-controller-manager"
Mar 16 00:14:23 crc kubenswrapper[4816]: I0316 00:14:23.540167 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="d54cdab6-e848-4a30-b64f-7b257a403479" containerName="controller-manager"
Mar 16 00:14:23 crc kubenswrapper[4816]: I0316 00:14:23.540185 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee03d74f-fc71-4caa-b296-7bde75124d84" containerName="route-controller-manager"
Mar 16 00:14:23 crc kubenswrapper[4816]: I0316 00:14:23.540196 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="5160d394-3d9b-4066-9bea-b9dd787b2a42" containerName="oc"
Mar 16 00:14:23 crc kubenswrapper[4816]: I0316 00:14:23.540676 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5cff4cf95-6lkfg"
Mar 16 00:14:23 crc kubenswrapper[4816]: I0316 00:14:23.544400 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 16 00:14:23 crc kubenswrapper[4816]: I0316 00:14:23.544482 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 16 00:14:23 crc kubenswrapper[4816]: I0316 00:14:23.544405 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Mar 16 00:14:23 crc kubenswrapper[4816]: I0316 00:14:23.545798 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 16 00:14:23 crc kubenswrapper[4816]: I0316 00:14:23.546149 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Mar 16 00:14:23 crc kubenswrapper[4816]: I0316 00:14:23.548175 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5959f55db6-fjth7"]
Mar 16 00:14:23 crc kubenswrapper[4816]: I0316 00:14:23.549652 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5959f55db6-fjth7"
Mar 16 00:14:23 crc kubenswrapper[4816]: I0316 00:14:23.549874 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Mar 16 00:14:23 crc kubenswrapper[4816]: I0316 00:14:23.555862 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Mar 16 00:14:23 crc kubenswrapper[4816]: I0316 00:14:23.556349 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 16 00:14:23 crc kubenswrapper[4816]: I0316 00:14:23.556484 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5cff4cf95-6lkfg"]
Mar 16 00:14:23 crc kubenswrapper[4816]: I0316 00:14:23.557120 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 16 00:14:23 crc kubenswrapper[4816]: I0316 00:14:23.557465 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 16 00:14:23 crc kubenswrapper[4816]: I0316 00:14:23.560123 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 16 00:14:23 crc kubenswrapper[4816]: I0316 00:14:23.561085 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 16 00:14:23 crc kubenswrapper[4816]: I0316 00:14:23.570302 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5959f55db6-fjth7"]
Mar 16 00:14:23 crc kubenswrapper[4816]: I0316 00:14:23.572798 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 16 00:14:23 crc kubenswrapper[4816]: I0316 00:14:23.673089 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d54cdab6-e848-4a30-b64f-7b257a403479" path="/var/lib/kubelet/pods/d54cdab6-e848-4a30-b64f-7b257a403479/volumes"
Mar 16 00:14:23 crc kubenswrapper[4816]: I0316 00:14:23.673596 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee03d74f-fc71-4caa-b296-7bde75124d84" path="/var/lib/kubelet/pods/ee03d74f-fc71-4caa-b296-7bde75124d84/volumes"
Mar 16 00:14:23 crc kubenswrapper[4816]: I0316 00:14:23.741772 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bd82845b-11d2-4f56-baef-9217ec8fb5d9-serving-cert\") pod \"controller-manager-5959f55db6-fjth7\" (UID: \"bd82845b-11d2-4f56-baef-9217ec8fb5d9\") " pod="openshift-controller-manager/controller-manager-5959f55db6-fjth7"
Mar 16 00:14:23 crc kubenswrapper[4816]: I0316 00:14:23.741841 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/52abc92a-155b-4167-826a-de9f1aa0ce44-serving-cert\") pod \"route-controller-manager-5cff4cf95-6lkfg\" (UID: \"52abc92a-155b-4167-826a-de9f1aa0ce44\") " pod="openshift-route-controller-manager/route-controller-manager-5cff4cf95-6lkfg"
Mar 16 00:14:23 crc kubenswrapper[4816]: I0316 00:14:23.741922 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bd82845b-11d2-4f56-baef-9217ec8fb5d9-proxy-ca-bundles\") pod \"controller-manager-5959f55db6-fjth7\" (UID: \"bd82845b-11d2-4f56-baef-9217ec8fb5d9\") " pod="openshift-controller-manager/controller-manager-5959f55db6-fjth7"
Mar 16 00:14:23 crc kubenswrapper[4816]: I0316 00:14:23.741952 4816 reconciler_common.go:245]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8dvk\" (UniqueName: \"kubernetes.io/projected/bd82845b-11d2-4f56-baef-9217ec8fb5d9-kube-api-access-c8dvk\") pod \"controller-manager-5959f55db6-fjth7\" (UID: \"bd82845b-11d2-4f56-baef-9217ec8fb5d9\") " pod="openshift-controller-manager/controller-manager-5959f55db6-fjth7" Mar 16 00:14:23 crc kubenswrapper[4816]: I0316 00:14:23.741972 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6lmzh\" (UniqueName: \"kubernetes.io/projected/52abc92a-155b-4167-826a-de9f1aa0ce44-kube-api-access-6lmzh\") pod \"route-controller-manager-5cff4cf95-6lkfg\" (UID: \"52abc92a-155b-4167-826a-de9f1aa0ce44\") " pod="openshift-route-controller-manager/route-controller-manager-5cff4cf95-6lkfg" Mar 16 00:14:23 crc kubenswrapper[4816]: I0316 00:14:23.741993 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/52abc92a-155b-4167-826a-de9f1aa0ce44-client-ca\") pod \"route-controller-manager-5cff4cf95-6lkfg\" (UID: \"52abc92a-155b-4167-826a-de9f1aa0ce44\") " pod="openshift-route-controller-manager/route-controller-manager-5cff4cf95-6lkfg" Mar 16 00:14:23 crc kubenswrapper[4816]: I0316 00:14:23.742023 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd82845b-11d2-4f56-baef-9217ec8fb5d9-config\") pod \"controller-manager-5959f55db6-fjth7\" (UID: \"bd82845b-11d2-4f56-baef-9217ec8fb5d9\") " pod="openshift-controller-manager/controller-manager-5959f55db6-fjth7" Mar 16 00:14:23 crc kubenswrapper[4816]: I0316 00:14:23.742043 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bd82845b-11d2-4f56-baef-9217ec8fb5d9-client-ca\") pod 
\"controller-manager-5959f55db6-fjth7\" (UID: \"bd82845b-11d2-4f56-baef-9217ec8fb5d9\") " pod="openshift-controller-manager/controller-manager-5959f55db6-fjth7" Mar 16 00:14:23 crc kubenswrapper[4816]: I0316 00:14:23.742070 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/52abc92a-155b-4167-826a-de9f1aa0ce44-config\") pod \"route-controller-manager-5cff4cf95-6lkfg\" (UID: \"52abc92a-155b-4167-826a-de9f1aa0ce44\") " pod="openshift-route-controller-manager/route-controller-manager-5cff4cf95-6lkfg" Mar 16 00:14:23 crc kubenswrapper[4816]: I0316 00:14:23.843322 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/52abc92a-155b-4167-826a-de9f1aa0ce44-client-ca\") pod \"route-controller-manager-5cff4cf95-6lkfg\" (UID: \"52abc92a-155b-4167-826a-de9f1aa0ce44\") " pod="openshift-route-controller-manager/route-controller-manager-5cff4cf95-6lkfg" Mar 16 00:14:23 crc kubenswrapper[4816]: I0316 00:14:23.843425 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd82845b-11d2-4f56-baef-9217ec8fb5d9-config\") pod \"controller-manager-5959f55db6-fjth7\" (UID: \"bd82845b-11d2-4f56-baef-9217ec8fb5d9\") " pod="openshift-controller-manager/controller-manager-5959f55db6-fjth7" Mar 16 00:14:23 crc kubenswrapper[4816]: I0316 00:14:23.843502 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bd82845b-11d2-4f56-baef-9217ec8fb5d9-client-ca\") pod \"controller-manager-5959f55db6-fjth7\" (UID: \"bd82845b-11d2-4f56-baef-9217ec8fb5d9\") " pod="openshift-controller-manager/controller-manager-5959f55db6-fjth7" Mar 16 00:14:23 crc kubenswrapper[4816]: I0316 00:14:23.843612 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/52abc92a-155b-4167-826a-de9f1aa0ce44-config\") pod \"route-controller-manager-5cff4cf95-6lkfg\" (UID: \"52abc92a-155b-4167-826a-de9f1aa0ce44\") " pod="openshift-route-controller-manager/route-controller-manager-5cff4cf95-6lkfg" Mar 16 00:14:23 crc kubenswrapper[4816]: I0316 00:14:23.843712 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bd82845b-11d2-4f56-baef-9217ec8fb5d9-serving-cert\") pod \"controller-manager-5959f55db6-fjth7\" (UID: \"bd82845b-11d2-4f56-baef-9217ec8fb5d9\") " pod="openshift-controller-manager/controller-manager-5959f55db6-fjth7" Mar 16 00:14:23 crc kubenswrapper[4816]: I0316 00:14:23.843784 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/52abc92a-155b-4167-826a-de9f1aa0ce44-serving-cert\") pod \"route-controller-manager-5cff4cf95-6lkfg\" (UID: \"52abc92a-155b-4167-826a-de9f1aa0ce44\") " pod="openshift-route-controller-manager/route-controller-manager-5cff4cf95-6lkfg" Mar 16 00:14:23 crc kubenswrapper[4816]: I0316 00:14:23.843855 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bd82845b-11d2-4f56-baef-9217ec8fb5d9-proxy-ca-bundles\") pod \"controller-manager-5959f55db6-fjth7\" (UID: \"bd82845b-11d2-4f56-baef-9217ec8fb5d9\") " pod="openshift-controller-manager/controller-manager-5959f55db6-fjth7" Mar 16 00:14:23 crc kubenswrapper[4816]: I0316 00:14:23.843903 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c8dvk\" (UniqueName: \"kubernetes.io/projected/bd82845b-11d2-4f56-baef-9217ec8fb5d9-kube-api-access-c8dvk\") pod \"controller-manager-5959f55db6-fjth7\" (UID: \"bd82845b-11d2-4f56-baef-9217ec8fb5d9\") " pod="openshift-controller-manager/controller-manager-5959f55db6-fjth7" Mar 16 
00:14:23 crc kubenswrapper[4816]: I0316 00:14:23.843953 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6lmzh\" (UniqueName: \"kubernetes.io/projected/52abc92a-155b-4167-826a-de9f1aa0ce44-kube-api-access-6lmzh\") pod \"route-controller-manager-5cff4cf95-6lkfg\" (UID: \"52abc92a-155b-4167-826a-de9f1aa0ce44\") " pod="openshift-route-controller-manager/route-controller-manager-5cff4cf95-6lkfg" Mar 16 00:14:23 crc kubenswrapper[4816]: I0316 00:14:23.845531 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/52abc92a-155b-4167-826a-de9f1aa0ce44-client-ca\") pod \"route-controller-manager-5cff4cf95-6lkfg\" (UID: \"52abc92a-155b-4167-826a-de9f1aa0ce44\") " pod="openshift-route-controller-manager/route-controller-manager-5cff4cf95-6lkfg" Mar 16 00:14:23 crc kubenswrapper[4816]: I0316 00:14:23.846351 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/52abc92a-155b-4167-826a-de9f1aa0ce44-config\") pod \"route-controller-manager-5cff4cf95-6lkfg\" (UID: \"52abc92a-155b-4167-826a-de9f1aa0ce44\") " pod="openshift-route-controller-manager/route-controller-manager-5cff4cf95-6lkfg" Mar 16 00:14:23 crc kubenswrapper[4816]: I0316 00:14:23.846455 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd82845b-11d2-4f56-baef-9217ec8fb5d9-config\") pod \"controller-manager-5959f55db6-fjth7\" (UID: \"bd82845b-11d2-4f56-baef-9217ec8fb5d9\") " pod="openshift-controller-manager/controller-manager-5959f55db6-fjth7" Mar 16 00:14:23 crc kubenswrapper[4816]: I0316 00:14:23.846532 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bd82845b-11d2-4f56-baef-9217ec8fb5d9-client-ca\") pod \"controller-manager-5959f55db6-fjth7\" (UID: 
\"bd82845b-11d2-4f56-baef-9217ec8fb5d9\") " pod="openshift-controller-manager/controller-manager-5959f55db6-fjth7" Mar 16 00:14:23 crc kubenswrapper[4816]: I0316 00:14:23.847389 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bd82845b-11d2-4f56-baef-9217ec8fb5d9-proxy-ca-bundles\") pod \"controller-manager-5959f55db6-fjth7\" (UID: \"bd82845b-11d2-4f56-baef-9217ec8fb5d9\") " pod="openshift-controller-manager/controller-manager-5959f55db6-fjth7" Mar 16 00:14:23 crc kubenswrapper[4816]: I0316 00:14:23.850086 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/52abc92a-155b-4167-826a-de9f1aa0ce44-serving-cert\") pod \"route-controller-manager-5cff4cf95-6lkfg\" (UID: \"52abc92a-155b-4167-826a-de9f1aa0ce44\") " pod="openshift-route-controller-manager/route-controller-manager-5cff4cf95-6lkfg" Mar 16 00:14:23 crc kubenswrapper[4816]: I0316 00:14:23.857722 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bd82845b-11d2-4f56-baef-9217ec8fb5d9-serving-cert\") pod \"controller-manager-5959f55db6-fjth7\" (UID: \"bd82845b-11d2-4f56-baef-9217ec8fb5d9\") " pod="openshift-controller-manager/controller-manager-5959f55db6-fjth7" Mar 16 00:14:23 crc kubenswrapper[4816]: I0316 00:14:23.876984 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6lmzh\" (UniqueName: \"kubernetes.io/projected/52abc92a-155b-4167-826a-de9f1aa0ce44-kube-api-access-6lmzh\") pod \"route-controller-manager-5cff4cf95-6lkfg\" (UID: \"52abc92a-155b-4167-826a-de9f1aa0ce44\") " pod="openshift-route-controller-manager/route-controller-manager-5cff4cf95-6lkfg" Mar 16 00:14:23 crc kubenswrapper[4816]: I0316 00:14:23.880883 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8dvk\" (UniqueName: 
\"kubernetes.io/projected/bd82845b-11d2-4f56-baef-9217ec8fb5d9-kube-api-access-c8dvk\") pod \"controller-manager-5959f55db6-fjth7\" (UID: \"bd82845b-11d2-4f56-baef-9217ec8fb5d9\") " pod="openshift-controller-manager/controller-manager-5959f55db6-fjth7" Mar 16 00:14:23 crc kubenswrapper[4816]: I0316 00:14:23.881107 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5cff4cf95-6lkfg" Mar 16 00:14:23 crc kubenswrapper[4816]: I0316 00:14:23.900786 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5959f55db6-fjth7" Mar 16 00:14:24 crc kubenswrapper[4816]: I0316 00:14:24.206084 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5959f55db6-fjth7"] Mar 16 00:14:24 crc kubenswrapper[4816]: W0316 00:14:24.342067 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod52abc92a_155b_4167_826a_de9f1aa0ce44.slice/crio-47458505eb5293d436500049d82178fbb3a456e34ebfa3f32d2a53fddf62df0c WatchSource:0}: Error finding container 47458505eb5293d436500049d82178fbb3a456e34ebfa3f32d2a53fddf62df0c: Status 404 returned error can't find the container with id 47458505eb5293d436500049d82178fbb3a456e34ebfa3f32d2a53fddf62df0c Mar 16 00:14:24 crc kubenswrapper[4816]: I0316 00:14:24.343317 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5cff4cf95-6lkfg"] Mar 16 00:14:25 crc kubenswrapper[4816]: I0316 00:14:25.067672 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5959f55db6-fjth7" event={"ID":"bd82845b-11d2-4f56-baef-9217ec8fb5d9","Type":"ContainerStarted","Data":"2e215180fdce74e05454572f7c396d2d53b12a2397f880975a9a7a4cdbd2b141"} Mar 16 00:14:25 crc kubenswrapper[4816]: 
I0316 00:14:25.067997 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5959f55db6-fjth7" event={"ID":"bd82845b-11d2-4f56-baef-9217ec8fb5d9","Type":"ContainerStarted","Data":"12fb963e32c36ab5a15988d6a8d9cbcef352176f4c3d4d9612d64e86b4de2ee1"} Mar 16 00:14:25 crc kubenswrapper[4816]: I0316 00:14:25.068023 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5959f55db6-fjth7" Mar 16 00:14:25 crc kubenswrapper[4816]: I0316 00:14:25.069788 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5cff4cf95-6lkfg" event={"ID":"52abc92a-155b-4167-826a-de9f1aa0ce44","Type":"ContainerStarted","Data":"91e8e1aaa2727c987a514ce7c4c9cf6da470e267ad18af4e3a5c0d7c59840589"} Mar 16 00:14:25 crc kubenswrapper[4816]: I0316 00:14:25.069817 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5cff4cf95-6lkfg" event={"ID":"52abc92a-155b-4167-826a-de9f1aa0ce44","Type":"ContainerStarted","Data":"47458505eb5293d436500049d82178fbb3a456e34ebfa3f32d2a53fddf62df0c"} Mar 16 00:14:25 crc kubenswrapper[4816]: I0316 00:14:25.070038 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5cff4cf95-6lkfg" Mar 16 00:14:25 crc kubenswrapper[4816]: I0316 00:14:25.074006 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5959f55db6-fjth7" Mar 16 00:14:25 crc kubenswrapper[4816]: I0316 00:14:25.075851 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5cff4cf95-6lkfg" Mar 16 00:14:25 crc kubenswrapper[4816]: I0316 00:14:25.090987 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-controller-manager/controller-manager-5959f55db6-fjth7" podStartSLOduration=4.090967533 podStartE2EDuration="4.090967533s" podCreationTimestamp="2026-03-16 00:14:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:14:25.087202793 +0000 UTC m=+458.183502756" watchObservedRunningTime="2026-03-16 00:14:25.090967533 +0000 UTC m=+458.187267506" Mar 16 00:14:25 crc kubenswrapper[4816]: I0316 00:14:25.111014 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5cff4cf95-6lkfg" podStartSLOduration=4.110988507 podStartE2EDuration="4.110988507s" podCreationTimestamp="2026-03-16 00:14:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:14:25.104176008 +0000 UTC m=+458.200475971" watchObservedRunningTime="2026-03-16 00:14:25.110988507 +0000 UTC m=+458.207288500" Mar 16 00:14:31 crc kubenswrapper[4816]: I0316 00:14:31.863368 4816 patch_prober.go:28] interesting pod/machine-config-daemon-jrdcz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 16 00:14:31 crc kubenswrapper[4816]: I0316 00:14:31.864062 4816 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" podUID="dd08ece2-7636-4966-973a-e96a34b70b53" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 16 00:14:47 crc kubenswrapper[4816]: I0316 00:14:47.421644 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-sshl5"] Mar 16 
00:15:00 crc kubenswrapper[4816]: I0316 00:15:00.141888 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29560335-p96p7"] Mar 16 00:15:00 crc kubenswrapper[4816]: I0316 00:15:00.143063 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29560335-p96p7" Mar 16 00:15:00 crc kubenswrapper[4816]: I0316 00:15:00.145651 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 16 00:15:00 crc kubenswrapper[4816]: I0316 00:15:00.145763 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 16 00:15:00 crc kubenswrapper[4816]: I0316 00:15:00.154453 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29560335-p96p7"] Mar 16 00:15:00 crc kubenswrapper[4816]: I0316 00:15:00.315952 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3a9c69dc-0684-421d-a7aa-6fb257f59909-secret-volume\") pod \"collect-profiles-29560335-p96p7\" (UID: \"3a9c69dc-0684-421d-a7aa-6fb257f59909\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29560335-p96p7" Mar 16 00:15:00 crc kubenswrapper[4816]: I0316 00:15:00.316056 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t9lxp\" (UniqueName: \"kubernetes.io/projected/3a9c69dc-0684-421d-a7aa-6fb257f59909-kube-api-access-t9lxp\") pod \"collect-profiles-29560335-p96p7\" (UID: \"3a9c69dc-0684-421d-a7aa-6fb257f59909\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29560335-p96p7" Mar 16 00:15:00 crc kubenswrapper[4816]: I0316 00:15:00.316180 4816 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3a9c69dc-0684-421d-a7aa-6fb257f59909-config-volume\") pod \"collect-profiles-29560335-p96p7\" (UID: \"3a9c69dc-0684-421d-a7aa-6fb257f59909\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29560335-p96p7" Mar 16 00:15:00 crc kubenswrapper[4816]: I0316 00:15:00.416931 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3a9c69dc-0684-421d-a7aa-6fb257f59909-secret-volume\") pod \"collect-profiles-29560335-p96p7\" (UID: \"3a9c69dc-0684-421d-a7aa-6fb257f59909\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29560335-p96p7" Mar 16 00:15:00 crc kubenswrapper[4816]: I0316 00:15:00.416984 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t9lxp\" (UniqueName: \"kubernetes.io/projected/3a9c69dc-0684-421d-a7aa-6fb257f59909-kube-api-access-t9lxp\") pod \"collect-profiles-29560335-p96p7\" (UID: \"3a9c69dc-0684-421d-a7aa-6fb257f59909\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29560335-p96p7" Mar 16 00:15:00 crc kubenswrapper[4816]: I0316 00:15:00.417032 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3a9c69dc-0684-421d-a7aa-6fb257f59909-config-volume\") pod \"collect-profiles-29560335-p96p7\" (UID: \"3a9c69dc-0684-421d-a7aa-6fb257f59909\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29560335-p96p7" Mar 16 00:15:00 crc kubenswrapper[4816]: I0316 00:15:00.417993 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3a9c69dc-0684-421d-a7aa-6fb257f59909-config-volume\") pod \"collect-profiles-29560335-p96p7\" (UID: \"3a9c69dc-0684-421d-a7aa-6fb257f59909\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29560335-p96p7" Mar 16 00:15:00 crc kubenswrapper[4816]: I0316 00:15:00.423807 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3a9c69dc-0684-421d-a7aa-6fb257f59909-secret-volume\") pod \"collect-profiles-29560335-p96p7\" (UID: \"3a9c69dc-0684-421d-a7aa-6fb257f59909\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29560335-p96p7" Mar 16 00:15:00 crc kubenswrapper[4816]: I0316 00:15:00.439201 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t9lxp\" (UniqueName: \"kubernetes.io/projected/3a9c69dc-0684-421d-a7aa-6fb257f59909-kube-api-access-t9lxp\") pod \"collect-profiles-29560335-p96p7\" (UID: \"3a9c69dc-0684-421d-a7aa-6fb257f59909\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29560335-p96p7" Mar 16 00:15:00 crc kubenswrapper[4816]: I0316 00:15:00.462820 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29560335-p96p7" Mar 16 00:15:00 crc kubenswrapper[4816]: I0316 00:15:00.861877 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29560335-p96p7"] Mar 16 00:15:01 crc kubenswrapper[4816]: I0316 00:15:01.305683 4816 generic.go:334] "Generic (PLEG): container finished" podID="3a9c69dc-0684-421d-a7aa-6fb257f59909" containerID="2f274fd6c476ac0e73a688a7bbce794f8e491674e1b0838204690a79b7a28dfd" exitCode=0 Mar 16 00:15:01 crc kubenswrapper[4816]: I0316 00:15:01.305743 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29560335-p96p7" event={"ID":"3a9c69dc-0684-421d-a7aa-6fb257f59909","Type":"ContainerDied","Data":"2f274fd6c476ac0e73a688a7bbce794f8e491674e1b0838204690a79b7a28dfd"} Mar 16 00:15:01 crc kubenswrapper[4816]: I0316 00:15:01.305772 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29560335-p96p7" event={"ID":"3a9c69dc-0684-421d-a7aa-6fb257f59909","Type":"ContainerStarted","Data":"648ab9f6adc93138c8533a5442aa38b4b2995f055ec75688ef2547b9f3713571"} Mar 16 00:15:01 crc kubenswrapper[4816]: I0316 00:15:01.863653 4816 patch_prober.go:28] interesting pod/machine-config-daemon-jrdcz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 16 00:15:01 crc kubenswrapper[4816]: I0316 00:15:01.863749 4816 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" podUID="dd08ece2-7636-4966-973a-e96a34b70b53" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 16 00:15:02 
crc kubenswrapper[4816]: I0316 00:15:02.673026 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29560335-p96p7" Mar 16 00:15:02 crc kubenswrapper[4816]: I0316 00:15:02.860157 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3a9c69dc-0684-421d-a7aa-6fb257f59909-config-volume\") pod \"3a9c69dc-0684-421d-a7aa-6fb257f59909\" (UID: \"3a9c69dc-0684-421d-a7aa-6fb257f59909\") " Mar 16 00:15:02 crc kubenswrapper[4816]: I0316 00:15:02.860248 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3a9c69dc-0684-421d-a7aa-6fb257f59909-secret-volume\") pod \"3a9c69dc-0684-421d-a7aa-6fb257f59909\" (UID: \"3a9c69dc-0684-421d-a7aa-6fb257f59909\") " Mar 16 00:15:02 crc kubenswrapper[4816]: I0316 00:15:02.860286 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t9lxp\" (UniqueName: \"kubernetes.io/projected/3a9c69dc-0684-421d-a7aa-6fb257f59909-kube-api-access-t9lxp\") pod \"3a9c69dc-0684-421d-a7aa-6fb257f59909\" (UID: \"3a9c69dc-0684-421d-a7aa-6fb257f59909\") " Mar 16 00:15:02 crc kubenswrapper[4816]: I0316 00:15:02.861264 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a9c69dc-0684-421d-a7aa-6fb257f59909-config-volume" (OuterVolumeSpecName: "config-volume") pod "3a9c69dc-0684-421d-a7aa-6fb257f59909" (UID: "3a9c69dc-0684-421d-a7aa-6fb257f59909"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:15:02 crc kubenswrapper[4816]: I0316 00:15:02.871358 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a9c69dc-0684-421d-a7aa-6fb257f59909-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "3a9c69dc-0684-421d-a7aa-6fb257f59909" (UID: "3a9c69dc-0684-421d-a7aa-6fb257f59909"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:15:02 crc kubenswrapper[4816]: I0316 00:15:02.871543 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a9c69dc-0684-421d-a7aa-6fb257f59909-kube-api-access-t9lxp" (OuterVolumeSpecName: "kube-api-access-t9lxp") pod "3a9c69dc-0684-421d-a7aa-6fb257f59909" (UID: "3a9c69dc-0684-421d-a7aa-6fb257f59909"). InnerVolumeSpecName "kube-api-access-t9lxp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:15:02 crc kubenswrapper[4816]: I0316 00:15:02.961584 4816 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3a9c69dc-0684-421d-a7aa-6fb257f59909-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 16 00:15:02 crc kubenswrapper[4816]: I0316 00:15:02.961643 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t9lxp\" (UniqueName: \"kubernetes.io/projected/3a9c69dc-0684-421d-a7aa-6fb257f59909-kube-api-access-t9lxp\") on node \"crc\" DevicePath \"\"" Mar 16 00:15:02 crc kubenswrapper[4816]: I0316 00:15:02.961664 4816 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3a9c69dc-0684-421d-a7aa-6fb257f59909-config-volume\") on node \"crc\" DevicePath \"\"" Mar 16 00:15:03 crc kubenswrapper[4816]: I0316 00:15:03.318476 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29560335-p96p7" 
event={"ID":"3a9c69dc-0684-421d-a7aa-6fb257f59909","Type":"ContainerDied","Data":"648ab9f6adc93138c8533a5442aa38b4b2995f055ec75688ef2547b9f3713571"} Mar 16 00:15:03 crc kubenswrapper[4816]: I0316 00:15:03.318532 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="648ab9f6adc93138c8533a5442aa38b4b2995f055ec75688ef2547b9f3713571" Mar 16 00:15:03 crc kubenswrapper[4816]: I0316 00:15:03.318611 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29560335-p96p7" Mar 16 00:15:12 crc kubenswrapper[4816]: I0316 00:15:12.452175 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-sshl5" podUID="7c3e347f-464a-43f1-bf29-689bf81a28e6" containerName="oauth-openshift" containerID="cri-o://e323e811f301454593a76b4a27e968d571ea79e5909647b904b1cad07862ea62" gracePeriod=15 Mar 16 00:15:12 crc kubenswrapper[4816]: I0316 00:15:12.898413 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-sshl5" Mar 16 00:15:12 crc kubenswrapper[4816]: I0316 00:15:12.907436 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/7c3e347f-464a-43f1-bf29-689bf81a28e6-v4-0-config-user-idp-0-file-data\") pod \"7c3e347f-464a-43f1-bf29-689bf81a28e6\" (UID: \"7c3e347f-464a-43f1-bf29-689bf81a28e6\") " Mar 16 00:15:12 crc kubenswrapper[4816]: I0316 00:15:12.907533 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/7c3e347f-464a-43f1-bf29-689bf81a28e6-v4-0-config-system-ocp-branding-template\") pod \"7c3e347f-464a-43f1-bf29-689bf81a28e6\" (UID: \"7c3e347f-464a-43f1-bf29-689bf81a28e6\") " Mar 16 00:15:12 crc kubenswrapper[4816]: I0316 00:15:12.907585 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/7c3e347f-464a-43f1-bf29-689bf81a28e6-v4-0-config-system-cliconfig\") pod \"7c3e347f-464a-43f1-bf29-689bf81a28e6\" (UID: \"7c3e347f-464a-43f1-bf29-689bf81a28e6\") " Mar 16 00:15:12 crc kubenswrapper[4816]: I0316 00:15:12.907607 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/7c3e347f-464a-43f1-bf29-689bf81a28e6-v4-0-config-user-template-error\") pod \"7c3e347f-464a-43f1-bf29-689bf81a28e6\" (UID: \"7c3e347f-464a-43f1-bf29-689bf81a28e6\") " Mar 16 00:15:12 crc kubenswrapper[4816]: I0316 00:15:12.907641 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/7c3e347f-464a-43f1-bf29-689bf81a28e6-v4-0-config-system-serving-cert\") pod \"7c3e347f-464a-43f1-bf29-689bf81a28e6\" (UID: 
\"7c3e347f-464a-43f1-bf29-689bf81a28e6\") " Mar 16 00:15:12 crc kubenswrapper[4816]: I0316 00:15:12.907670 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/7c3e347f-464a-43f1-bf29-689bf81a28e6-v4-0-config-system-service-ca\") pod \"7c3e347f-464a-43f1-bf29-689bf81a28e6\" (UID: \"7c3e347f-464a-43f1-bf29-689bf81a28e6\") " Mar 16 00:15:12 crc kubenswrapper[4816]: I0316 00:15:12.907701 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/7c3e347f-464a-43f1-bf29-689bf81a28e6-v4-0-config-system-router-certs\") pod \"7c3e347f-464a-43f1-bf29-689bf81a28e6\" (UID: \"7c3e347f-464a-43f1-bf29-689bf81a28e6\") " Mar 16 00:15:12 crc kubenswrapper[4816]: I0316 00:15:12.907724 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/7c3e347f-464a-43f1-bf29-689bf81a28e6-v4-0-config-system-session\") pod \"7c3e347f-464a-43f1-bf29-689bf81a28e6\" (UID: \"7c3e347f-464a-43f1-bf29-689bf81a28e6\") " Mar 16 00:15:12 crc kubenswrapper[4816]: I0316 00:15:12.907751 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/7c3e347f-464a-43f1-bf29-689bf81a28e6-v4-0-config-user-template-provider-selection\") pod \"7c3e347f-464a-43f1-bf29-689bf81a28e6\" (UID: \"7c3e347f-464a-43f1-bf29-689bf81a28e6\") " Mar 16 00:15:12 crc kubenswrapper[4816]: I0316 00:15:12.907794 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7c3e347f-464a-43f1-bf29-689bf81a28e6-v4-0-config-system-trusted-ca-bundle\") pod \"7c3e347f-464a-43f1-bf29-689bf81a28e6\" (UID: \"7c3e347f-464a-43f1-bf29-689bf81a28e6\") " 
Mar 16 00:15:12 crc kubenswrapper[4816]: I0316 00:15:12.907826 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/7c3e347f-464a-43f1-bf29-689bf81a28e6-audit-policies\") pod \"7c3e347f-464a-43f1-bf29-689bf81a28e6\" (UID: \"7c3e347f-464a-43f1-bf29-689bf81a28e6\") " Mar 16 00:15:12 crc kubenswrapper[4816]: I0316 00:15:12.907854 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kxdqt\" (UniqueName: \"kubernetes.io/projected/7c3e347f-464a-43f1-bf29-689bf81a28e6-kube-api-access-kxdqt\") pod \"7c3e347f-464a-43f1-bf29-689bf81a28e6\" (UID: \"7c3e347f-464a-43f1-bf29-689bf81a28e6\") " Mar 16 00:15:12 crc kubenswrapper[4816]: I0316 00:15:12.907876 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/7c3e347f-464a-43f1-bf29-689bf81a28e6-v4-0-config-user-template-login\") pod \"7c3e347f-464a-43f1-bf29-689bf81a28e6\" (UID: \"7c3e347f-464a-43f1-bf29-689bf81a28e6\") " Mar 16 00:15:12 crc kubenswrapper[4816]: I0316 00:15:12.907902 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7c3e347f-464a-43f1-bf29-689bf81a28e6-audit-dir\") pod \"7c3e347f-464a-43f1-bf29-689bf81a28e6\" (UID: \"7c3e347f-464a-43f1-bf29-689bf81a28e6\") " Mar 16 00:15:12 crc kubenswrapper[4816]: I0316 00:15:12.908203 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7c3e347f-464a-43f1-bf29-689bf81a28e6-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "7c3e347f-464a-43f1-bf29-689bf81a28e6" (UID: "7c3e347f-464a-43f1-bf29-689bf81a28e6"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:15:12 crc kubenswrapper[4816]: I0316 00:15:12.910078 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7c3e347f-464a-43f1-bf29-689bf81a28e6-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "7c3e347f-464a-43f1-bf29-689bf81a28e6" (UID: "7c3e347f-464a-43f1-bf29-689bf81a28e6"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:15:12 crc kubenswrapper[4816]: I0316 00:15:12.910231 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7c3e347f-464a-43f1-bf29-689bf81a28e6-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "7c3e347f-464a-43f1-bf29-689bf81a28e6" (UID: "7c3e347f-464a-43f1-bf29-689bf81a28e6"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:15:12 crc kubenswrapper[4816]: I0316 00:15:12.910466 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7c3e347f-464a-43f1-bf29-689bf81a28e6-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "7c3e347f-464a-43f1-bf29-689bf81a28e6" (UID: "7c3e347f-464a-43f1-bf29-689bf81a28e6"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:15:12 crc kubenswrapper[4816]: I0316 00:15:12.911735 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7c3e347f-464a-43f1-bf29-689bf81a28e6-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "7c3e347f-464a-43f1-bf29-689bf81a28e6" (UID: "7c3e347f-464a-43f1-bf29-689bf81a28e6"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:15:12 crc kubenswrapper[4816]: I0316 00:15:12.914733 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c3e347f-464a-43f1-bf29-689bf81a28e6-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "7c3e347f-464a-43f1-bf29-689bf81a28e6" (UID: "7c3e347f-464a-43f1-bf29-689bf81a28e6"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:15:12 crc kubenswrapper[4816]: I0316 00:15:12.914979 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c3e347f-464a-43f1-bf29-689bf81a28e6-kube-api-access-kxdqt" (OuterVolumeSpecName: "kube-api-access-kxdqt") pod "7c3e347f-464a-43f1-bf29-689bf81a28e6" (UID: "7c3e347f-464a-43f1-bf29-689bf81a28e6"). InnerVolumeSpecName "kube-api-access-kxdqt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:15:12 crc kubenswrapper[4816]: I0316 00:15:12.915962 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c3e347f-464a-43f1-bf29-689bf81a28e6-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "7c3e347f-464a-43f1-bf29-689bf81a28e6" (UID: "7c3e347f-464a-43f1-bf29-689bf81a28e6"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:15:12 crc kubenswrapper[4816]: I0316 00:15:12.916402 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c3e347f-464a-43f1-bf29-689bf81a28e6-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "7c3e347f-464a-43f1-bf29-689bf81a28e6" (UID: "7c3e347f-464a-43f1-bf29-689bf81a28e6"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:15:12 crc kubenswrapper[4816]: I0316 00:15:12.916670 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c3e347f-464a-43f1-bf29-689bf81a28e6-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "7c3e347f-464a-43f1-bf29-689bf81a28e6" (UID: "7c3e347f-464a-43f1-bf29-689bf81a28e6"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:15:12 crc kubenswrapper[4816]: I0316 00:15:12.916933 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c3e347f-464a-43f1-bf29-689bf81a28e6-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "7c3e347f-464a-43f1-bf29-689bf81a28e6" (UID: "7c3e347f-464a-43f1-bf29-689bf81a28e6"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:15:12 crc kubenswrapper[4816]: I0316 00:15:12.917094 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c3e347f-464a-43f1-bf29-689bf81a28e6-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "7c3e347f-464a-43f1-bf29-689bf81a28e6" (UID: "7c3e347f-464a-43f1-bf29-689bf81a28e6"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:15:12 crc kubenswrapper[4816]: I0316 00:15:12.931718 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c3e347f-464a-43f1-bf29-689bf81a28e6-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "7c3e347f-464a-43f1-bf29-689bf81a28e6" (UID: "7c3e347f-464a-43f1-bf29-689bf81a28e6"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:15:12 crc kubenswrapper[4816]: I0316 00:15:12.934743 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c3e347f-464a-43f1-bf29-689bf81a28e6-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "7c3e347f-464a-43f1-bf29-689bf81a28e6" (UID: "7c3e347f-464a-43f1-bf29-689bf81a28e6"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:15:12 crc kubenswrapper[4816]: I0316 00:15:12.953413 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-6d789fbbf-wkdfq"] Mar 16 00:15:12 crc kubenswrapper[4816]: E0316 00:15:12.953682 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c3e347f-464a-43f1-bf29-689bf81a28e6" containerName="oauth-openshift" Mar 16 00:15:12 crc kubenswrapper[4816]: I0316 00:15:12.953697 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c3e347f-464a-43f1-bf29-689bf81a28e6" containerName="oauth-openshift" Mar 16 00:15:12 crc kubenswrapper[4816]: E0316 00:15:12.953710 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a9c69dc-0684-421d-a7aa-6fb257f59909" containerName="collect-profiles" Mar 16 00:15:12 crc kubenswrapper[4816]: I0316 00:15:12.953719 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a9c69dc-0684-421d-a7aa-6fb257f59909" containerName="collect-profiles" Mar 16 00:15:12 crc kubenswrapper[4816]: I0316 00:15:12.953904 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a9c69dc-0684-421d-a7aa-6fb257f59909" containerName="collect-profiles" Mar 16 00:15:12 crc kubenswrapper[4816]: I0316 00:15:12.953949 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c3e347f-464a-43f1-bf29-689bf81a28e6" containerName="oauth-openshift" Mar 16 00:15:12 crc kubenswrapper[4816]: I0316 00:15:12.954679 4816 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-6d789fbbf-wkdfq" Mar 16 00:15:12 crc kubenswrapper[4816]: I0316 00:15:12.970169 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-6d789fbbf-wkdfq"] Mar 16 00:15:13 crc kubenswrapper[4816]: I0316 00:15:13.008895 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/26a979e8-1691-4390-82da-4229125eb297-v4-0-config-user-template-login\") pod \"oauth-openshift-6d789fbbf-wkdfq\" (UID: \"26a979e8-1691-4390-82da-4229125eb297\") " pod="openshift-authentication/oauth-openshift-6d789fbbf-wkdfq" Mar 16 00:15:13 crc kubenswrapper[4816]: I0316 00:15:13.008936 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/26a979e8-1691-4390-82da-4229125eb297-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6d789fbbf-wkdfq\" (UID: \"26a979e8-1691-4390-82da-4229125eb297\") " pod="openshift-authentication/oauth-openshift-6d789fbbf-wkdfq" Mar 16 00:15:13 crc kubenswrapper[4816]: I0316 00:15:13.008965 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/26a979e8-1691-4390-82da-4229125eb297-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6d789fbbf-wkdfq\" (UID: \"26a979e8-1691-4390-82da-4229125eb297\") " pod="openshift-authentication/oauth-openshift-6d789fbbf-wkdfq" Mar 16 00:15:13 crc kubenswrapper[4816]: I0316 00:15:13.008984 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: 
\"kubernetes.io/secret/26a979e8-1691-4390-82da-4229125eb297-v4-0-config-system-router-certs\") pod \"oauth-openshift-6d789fbbf-wkdfq\" (UID: \"26a979e8-1691-4390-82da-4229125eb297\") " pod="openshift-authentication/oauth-openshift-6d789fbbf-wkdfq" Mar 16 00:15:13 crc kubenswrapper[4816]: I0316 00:15:13.009008 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/26a979e8-1691-4390-82da-4229125eb297-audit-dir\") pod \"oauth-openshift-6d789fbbf-wkdfq\" (UID: \"26a979e8-1691-4390-82da-4229125eb297\") " pod="openshift-authentication/oauth-openshift-6d789fbbf-wkdfq" Mar 16 00:15:13 crc kubenswrapper[4816]: I0316 00:15:13.009027 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/26a979e8-1691-4390-82da-4229125eb297-v4-0-config-system-session\") pod \"oauth-openshift-6d789fbbf-wkdfq\" (UID: \"26a979e8-1691-4390-82da-4229125eb297\") " pod="openshift-authentication/oauth-openshift-6d789fbbf-wkdfq" Mar 16 00:15:13 crc kubenswrapper[4816]: I0316 00:15:13.009045 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/26a979e8-1691-4390-82da-4229125eb297-v4-0-config-user-template-error\") pod \"oauth-openshift-6d789fbbf-wkdfq\" (UID: \"26a979e8-1691-4390-82da-4229125eb297\") " pod="openshift-authentication/oauth-openshift-6d789fbbf-wkdfq" Mar 16 00:15:13 crc kubenswrapper[4816]: I0316 00:15:13.009063 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/26a979e8-1691-4390-82da-4229125eb297-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6d789fbbf-wkdfq\" (UID: \"26a979e8-1691-4390-82da-4229125eb297\") " 
pod="openshift-authentication/oauth-openshift-6d789fbbf-wkdfq" Mar 16 00:15:13 crc kubenswrapper[4816]: I0316 00:15:13.009080 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/26a979e8-1691-4390-82da-4229125eb297-v4-0-config-system-service-ca\") pod \"oauth-openshift-6d789fbbf-wkdfq\" (UID: \"26a979e8-1691-4390-82da-4229125eb297\") " pod="openshift-authentication/oauth-openshift-6d789fbbf-wkdfq" Mar 16 00:15:13 crc kubenswrapper[4816]: I0316 00:15:13.009102 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/26a979e8-1691-4390-82da-4229125eb297-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6d789fbbf-wkdfq\" (UID: \"26a979e8-1691-4390-82da-4229125eb297\") " pod="openshift-authentication/oauth-openshift-6d789fbbf-wkdfq" Mar 16 00:15:13 crc kubenswrapper[4816]: I0316 00:15:13.009121 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/26a979e8-1691-4390-82da-4229125eb297-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6d789fbbf-wkdfq\" (UID: \"26a979e8-1691-4390-82da-4229125eb297\") " pod="openshift-authentication/oauth-openshift-6d789fbbf-wkdfq" Mar 16 00:15:13 crc kubenswrapper[4816]: I0316 00:15:13.009140 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/26a979e8-1691-4390-82da-4229125eb297-audit-policies\") pod \"oauth-openshift-6d789fbbf-wkdfq\" (UID: \"26a979e8-1691-4390-82da-4229125eb297\") " pod="openshift-authentication/oauth-openshift-6d789fbbf-wkdfq" Mar 16 00:15:13 crc kubenswrapper[4816]: I0316 00:15:13.009157 4816 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-js2jb\" (UniqueName: \"kubernetes.io/projected/26a979e8-1691-4390-82da-4229125eb297-kube-api-access-js2jb\") pod \"oauth-openshift-6d789fbbf-wkdfq\" (UID: \"26a979e8-1691-4390-82da-4229125eb297\") " pod="openshift-authentication/oauth-openshift-6d789fbbf-wkdfq" Mar 16 00:15:13 crc kubenswrapper[4816]: I0316 00:15:13.009178 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/26a979e8-1691-4390-82da-4229125eb297-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6d789fbbf-wkdfq\" (UID: \"26a979e8-1691-4390-82da-4229125eb297\") " pod="openshift-authentication/oauth-openshift-6d789fbbf-wkdfq" Mar 16 00:15:13 crc kubenswrapper[4816]: I0316 00:15:13.009228 4816 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/7c3e347f-464a-43f1-bf29-689bf81a28e6-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 16 00:15:13 crc kubenswrapper[4816]: I0316 00:15:13.009240 4816 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/7c3e347f-464a-43f1-bf29-689bf81a28e6-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 16 00:15:13 crc kubenswrapper[4816]: I0316 00:15:13.009251 4816 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/7c3e347f-464a-43f1-bf29-689bf81a28e6-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 16 00:15:13 crc kubenswrapper[4816]: I0316 00:15:13.009260 4816 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/7c3e347f-464a-43f1-bf29-689bf81a28e6-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 16 00:15:13 crc kubenswrapper[4816]: I0316 00:15:13.009269 4816 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/7c3e347f-464a-43f1-bf29-689bf81a28e6-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 16 00:15:13 crc kubenswrapper[4816]: I0316 00:15:13.009278 4816 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/7c3e347f-464a-43f1-bf29-689bf81a28e6-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 16 00:15:13 crc kubenswrapper[4816]: I0316 00:15:13.009288 4816 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/7c3e347f-464a-43f1-bf29-689bf81a28e6-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 16 00:15:13 crc kubenswrapper[4816]: I0316 00:15:13.009297 4816 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/7c3e347f-464a-43f1-bf29-689bf81a28e6-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 16 00:15:13 crc kubenswrapper[4816]: I0316 00:15:13.009307 4816 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/7c3e347f-464a-43f1-bf29-689bf81a28e6-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 16 00:15:13 crc kubenswrapper[4816]: I0316 00:15:13.009319 4816 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7c3e347f-464a-43f1-bf29-689bf81a28e6-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 16 00:15:13 crc kubenswrapper[4816]: I0316 
00:15:13.009329 4816 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/7c3e347f-464a-43f1-bf29-689bf81a28e6-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 16 00:15:13 crc kubenswrapper[4816]: I0316 00:15:13.009338 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kxdqt\" (UniqueName: \"kubernetes.io/projected/7c3e347f-464a-43f1-bf29-689bf81a28e6-kube-api-access-kxdqt\") on node \"crc\" DevicePath \"\"" Mar 16 00:15:13 crc kubenswrapper[4816]: I0316 00:15:13.009349 4816 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/7c3e347f-464a-43f1-bf29-689bf81a28e6-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 16 00:15:13 crc kubenswrapper[4816]: I0316 00:15:13.009358 4816 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7c3e347f-464a-43f1-bf29-689bf81a28e6-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 16 00:15:13 crc kubenswrapper[4816]: I0316 00:15:13.110834 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/26a979e8-1691-4390-82da-4229125eb297-audit-dir\") pod \"oauth-openshift-6d789fbbf-wkdfq\" (UID: \"26a979e8-1691-4390-82da-4229125eb297\") " pod="openshift-authentication/oauth-openshift-6d789fbbf-wkdfq" Mar 16 00:15:13 crc kubenswrapper[4816]: I0316 00:15:13.110872 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/26a979e8-1691-4390-82da-4229125eb297-v4-0-config-system-session\") pod \"oauth-openshift-6d789fbbf-wkdfq\" (UID: \"26a979e8-1691-4390-82da-4229125eb297\") " pod="openshift-authentication/oauth-openshift-6d789fbbf-wkdfq" Mar 16 00:15:13 crc kubenswrapper[4816]: I0316 00:15:13.110906 4816 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/26a979e8-1691-4390-82da-4229125eb297-v4-0-config-user-template-error\") pod \"oauth-openshift-6d789fbbf-wkdfq\" (UID: \"26a979e8-1691-4390-82da-4229125eb297\") " pod="openshift-authentication/oauth-openshift-6d789fbbf-wkdfq" Mar 16 00:15:13 crc kubenswrapper[4816]: I0316 00:15:13.110987 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/26a979e8-1691-4390-82da-4229125eb297-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6d789fbbf-wkdfq\" (UID: \"26a979e8-1691-4390-82da-4229125eb297\") " pod="openshift-authentication/oauth-openshift-6d789fbbf-wkdfq" Mar 16 00:15:13 crc kubenswrapper[4816]: I0316 00:15:13.111039 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/26a979e8-1691-4390-82da-4229125eb297-v4-0-config-system-service-ca\") pod \"oauth-openshift-6d789fbbf-wkdfq\" (UID: \"26a979e8-1691-4390-82da-4229125eb297\") " pod="openshift-authentication/oauth-openshift-6d789fbbf-wkdfq" Mar 16 00:15:13 crc kubenswrapper[4816]: I0316 00:15:13.111081 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/26a979e8-1691-4390-82da-4229125eb297-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6d789fbbf-wkdfq\" (UID: \"26a979e8-1691-4390-82da-4229125eb297\") " pod="openshift-authentication/oauth-openshift-6d789fbbf-wkdfq" Mar 16 00:15:13 crc kubenswrapper[4816]: I0316 00:15:13.111110 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/26a979e8-1691-4390-82da-4229125eb297-v4-0-config-user-idp-0-file-data\") pod 
\"oauth-openshift-6d789fbbf-wkdfq\" (UID: \"26a979e8-1691-4390-82da-4229125eb297\") " pod="openshift-authentication/oauth-openshift-6d789fbbf-wkdfq" Mar 16 00:15:13 crc kubenswrapper[4816]: I0316 00:15:13.111135 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/26a979e8-1691-4390-82da-4229125eb297-audit-policies\") pod \"oauth-openshift-6d789fbbf-wkdfq\" (UID: \"26a979e8-1691-4390-82da-4229125eb297\") " pod="openshift-authentication/oauth-openshift-6d789fbbf-wkdfq" Mar 16 00:15:13 crc kubenswrapper[4816]: I0316 00:15:13.111155 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-js2jb\" (UniqueName: \"kubernetes.io/projected/26a979e8-1691-4390-82da-4229125eb297-kube-api-access-js2jb\") pod \"oauth-openshift-6d789fbbf-wkdfq\" (UID: \"26a979e8-1691-4390-82da-4229125eb297\") " pod="openshift-authentication/oauth-openshift-6d789fbbf-wkdfq" Mar 16 00:15:13 crc kubenswrapper[4816]: I0316 00:15:13.111186 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/26a979e8-1691-4390-82da-4229125eb297-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6d789fbbf-wkdfq\" (UID: \"26a979e8-1691-4390-82da-4229125eb297\") " pod="openshift-authentication/oauth-openshift-6d789fbbf-wkdfq" Mar 16 00:15:13 crc kubenswrapper[4816]: I0316 00:15:13.111234 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/26a979e8-1691-4390-82da-4229125eb297-v4-0-config-user-template-login\") pod \"oauth-openshift-6d789fbbf-wkdfq\" (UID: \"26a979e8-1691-4390-82da-4229125eb297\") " pod="openshift-authentication/oauth-openshift-6d789fbbf-wkdfq" Mar 16 00:15:13 crc kubenswrapper[4816]: I0316 00:15:13.111285 4816 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/26a979e8-1691-4390-82da-4229125eb297-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6d789fbbf-wkdfq\" (UID: \"26a979e8-1691-4390-82da-4229125eb297\") " pod="openshift-authentication/oauth-openshift-6d789fbbf-wkdfq" Mar 16 00:15:13 crc kubenswrapper[4816]: I0316 00:15:13.111316 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/26a979e8-1691-4390-82da-4229125eb297-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6d789fbbf-wkdfq\" (UID: \"26a979e8-1691-4390-82da-4229125eb297\") " pod="openshift-authentication/oauth-openshift-6d789fbbf-wkdfq" Mar 16 00:15:13 crc kubenswrapper[4816]: I0316 00:15:13.111346 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/26a979e8-1691-4390-82da-4229125eb297-v4-0-config-system-router-certs\") pod \"oauth-openshift-6d789fbbf-wkdfq\" (UID: \"26a979e8-1691-4390-82da-4229125eb297\") " pod="openshift-authentication/oauth-openshift-6d789fbbf-wkdfq" Mar 16 00:15:13 crc kubenswrapper[4816]: I0316 00:15:13.112862 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/26a979e8-1691-4390-82da-4229125eb297-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6d789fbbf-wkdfq\" (UID: \"26a979e8-1691-4390-82da-4229125eb297\") " pod="openshift-authentication/oauth-openshift-6d789fbbf-wkdfq" Mar 16 00:15:13 crc kubenswrapper[4816]: I0316 00:15:13.113110 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/26a979e8-1691-4390-82da-4229125eb297-audit-policies\") pod \"oauth-openshift-6d789fbbf-wkdfq\" (UID: 
\"26a979e8-1691-4390-82da-4229125eb297\") " pod="openshift-authentication/oauth-openshift-6d789fbbf-wkdfq" Mar 16 00:15:13 crc kubenswrapper[4816]: I0316 00:15:13.110931 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/26a979e8-1691-4390-82da-4229125eb297-audit-dir\") pod \"oauth-openshift-6d789fbbf-wkdfq\" (UID: \"26a979e8-1691-4390-82da-4229125eb297\") " pod="openshift-authentication/oauth-openshift-6d789fbbf-wkdfq" Mar 16 00:15:13 crc kubenswrapper[4816]: I0316 00:15:13.114066 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/26a979e8-1691-4390-82da-4229125eb297-v4-0-config-system-service-ca\") pod \"oauth-openshift-6d789fbbf-wkdfq\" (UID: \"26a979e8-1691-4390-82da-4229125eb297\") " pod="openshift-authentication/oauth-openshift-6d789fbbf-wkdfq" Mar 16 00:15:13 crc kubenswrapper[4816]: I0316 00:15:13.114913 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/26a979e8-1691-4390-82da-4229125eb297-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6d789fbbf-wkdfq\" (UID: \"26a979e8-1691-4390-82da-4229125eb297\") " pod="openshift-authentication/oauth-openshift-6d789fbbf-wkdfq" Mar 16 00:15:13 crc kubenswrapper[4816]: I0316 00:15:13.115779 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/26a979e8-1691-4390-82da-4229125eb297-v4-0-config-user-template-error\") pod \"oauth-openshift-6d789fbbf-wkdfq\" (UID: \"26a979e8-1691-4390-82da-4229125eb297\") " pod="openshift-authentication/oauth-openshift-6d789fbbf-wkdfq" Mar 16 00:15:13 crc kubenswrapper[4816]: I0316 00:15:13.116461 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: 
\"kubernetes.io/secret/26a979e8-1691-4390-82da-4229125eb297-v4-0-config-system-session\") pod \"oauth-openshift-6d789fbbf-wkdfq\" (UID: \"26a979e8-1691-4390-82da-4229125eb297\") " pod="openshift-authentication/oauth-openshift-6d789fbbf-wkdfq" Mar 16 00:15:13 crc kubenswrapper[4816]: I0316 00:15:13.117576 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/26a979e8-1691-4390-82da-4229125eb297-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6d789fbbf-wkdfq\" (UID: \"26a979e8-1691-4390-82da-4229125eb297\") " pod="openshift-authentication/oauth-openshift-6d789fbbf-wkdfq" Mar 16 00:15:13 crc kubenswrapper[4816]: I0316 00:15:13.118067 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/26a979e8-1691-4390-82da-4229125eb297-v4-0-config-user-template-login\") pod \"oauth-openshift-6d789fbbf-wkdfq\" (UID: \"26a979e8-1691-4390-82da-4229125eb297\") " pod="openshift-authentication/oauth-openshift-6d789fbbf-wkdfq" Mar 16 00:15:13 crc kubenswrapper[4816]: I0316 00:15:13.118247 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/26a979e8-1691-4390-82da-4229125eb297-v4-0-config-system-router-certs\") pod \"oauth-openshift-6d789fbbf-wkdfq\" (UID: \"26a979e8-1691-4390-82da-4229125eb297\") " pod="openshift-authentication/oauth-openshift-6d789fbbf-wkdfq" Mar 16 00:15:13 crc kubenswrapper[4816]: I0316 00:15:13.119485 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/26a979e8-1691-4390-82da-4229125eb297-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6d789fbbf-wkdfq\" (UID: \"26a979e8-1691-4390-82da-4229125eb297\") " pod="openshift-authentication/oauth-openshift-6d789fbbf-wkdfq" Mar 16 00:15:13 crc 
kubenswrapper[4816]: I0316 00:15:13.120130 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/26a979e8-1691-4390-82da-4229125eb297-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6d789fbbf-wkdfq\" (UID: \"26a979e8-1691-4390-82da-4229125eb297\") " pod="openshift-authentication/oauth-openshift-6d789fbbf-wkdfq" Mar 16 00:15:13 crc kubenswrapper[4816]: I0316 00:15:13.124329 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/26a979e8-1691-4390-82da-4229125eb297-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6d789fbbf-wkdfq\" (UID: \"26a979e8-1691-4390-82da-4229125eb297\") " pod="openshift-authentication/oauth-openshift-6d789fbbf-wkdfq" Mar 16 00:15:13 crc kubenswrapper[4816]: I0316 00:15:13.130449 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-js2jb\" (UniqueName: \"kubernetes.io/projected/26a979e8-1691-4390-82da-4229125eb297-kube-api-access-js2jb\") pod \"oauth-openshift-6d789fbbf-wkdfq\" (UID: \"26a979e8-1691-4390-82da-4229125eb297\") " pod="openshift-authentication/oauth-openshift-6d789fbbf-wkdfq" Mar 16 00:15:13 crc kubenswrapper[4816]: I0316 00:15:13.290075 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-6d789fbbf-wkdfq" Mar 16 00:15:13 crc kubenswrapper[4816]: I0316 00:15:13.383202 4816 generic.go:334] "Generic (PLEG): container finished" podID="7c3e347f-464a-43f1-bf29-689bf81a28e6" containerID="e323e811f301454593a76b4a27e968d571ea79e5909647b904b1cad07862ea62" exitCode=0 Mar 16 00:15:13 crc kubenswrapper[4816]: I0316 00:15:13.383258 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-sshl5" event={"ID":"7c3e347f-464a-43f1-bf29-689bf81a28e6","Type":"ContainerDied","Data":"e323e811f301454593a76b4a27e968d571ea79e5909647b904b1cad07862ea62"} Mar 16 00:15:13 crc kubenswrapper[4816]: I0316 00:15:13.383275 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-sshl5" Mar 16 00:15:13 crc kubenswrapper[4816]: I0316 00:15:13.383300 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-sshl5" event={"ID":"7c3e347f-464a-43f1-bf29-689bf81a28e6","Type":"ContainerDied","Data":"3631ced358fcea8ef22224f7b1a8e3a7674d52e4a7296b38cf119840b4577b45"} Mar 16 00:15:13 crc kubenswrapper[4816]: I0316 00:15:13.383324 4816 scope.go:117] "RemoveContainer" containerID="e323e811f301454593a76b4a27e968d571ea79e5909647b904b1cad07862ea62" Mar 16 00:15:13 crc kubenswrapper[4816]: I0316 00:15:13.412915 4816 scope.go:117] "RemoveContainer" containerID="e323e811f301454593a76b4a27e968d571ea79e5909647b904b1cad07862ea62" Mar 16 00:15:13 crc kubenswrapper[4816]: E0316 00:15:13.413785 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e323e811f301454593a76b4a27e968d571ea79e5909647b904b1cad07862ea62\": container with ID starting with e323e811f301454593a76b4a27e968d571ea79e5909647b904b1cad07862ea62 not found: ID does not exist" 
containerID="e323e811f301454593a76b4a27e968d571ea79e5909647b904b1cad07862ea62" Mar 16 00:15:13 crc kubenswrapper[4816]: I0316 00:15:13.413829 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e323e811f301454593a76b4a27e968d571ea79e5909647b904b1cad07862ea62"} err="failed to get container status \"e323e811f301454593a76b4a27e968d571ea79e5909647b904b1cad07862ea62\": rpc error: code = NotFound desc = could not find container \"e323e811f301454593a76b4a27e968d571ea79e5909647b904b1cad07862ea62\": container with ID starting with e323e811f301454593a76b4a27e968d571ea79e5909647b904b1cad07862ea62 not found: ID does not exist" Mar 16 00:15:13 crc kubenswrapper[4816]: I0316 00:15:13.428387 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-sshl5"] Mar 16 00:15:13 crc kubenswrapper[4816]: I0316 00:15:13.433113 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-sshl5"] Mar 16 00:15:13 crc kubenswrapper[4816]: I0316 00:15:13.681420 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c3e347f-464a-43f1-bf29-689bf81a28e6" path="/var/lib/kubelet/pods/7c3e347f-464a-43f1-bf29-689bf81a28e6/volumes" Mar 16 00:15:13 crc kubenswrapper[4816]: I0316 00:15:13.745937 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-6d789fbbf-wkdfq"] Mar 16 00:15:14 crc kubenswrapper[4816]: I0316 00:15:14.390134 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-6d789fbbf-wkdfq" event={"ID":"26a979e8-1691-4390-82da-4229125eb297","Type":"ContainerStarted","Data":"6f88b1b3cf345f29131fb6d7972de6f49cd3d8631240363894017f0427d4e311"} Mar 16 00:15:14 crc kubenswrapper[4816]: I0316 00:15:14.391619 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-6d789fbbf-wkdfq" 
event={"ID":"26a979e8-1691-4390-82da-4229125eb297","Type":"ContainerStarted","Data":"9017a83ec7c036dddf975c0a4343a653f66606edad95f40aaaa39233661e7659"} Mar 16 00:15:14 crc kubenswrapper[4816]: I0316 00:15:14.391725 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-6d789fbbf-wkdfq" Mar 16 00:15:14 crc kubenswrapper[4816]: I0316 00:15:14.413862 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-6d789fbbf-wkdfq" Mar 16 00:15:14 crc kubenswrapper[4816]: I0316 00:15:14.444274 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-6d789fbbf-wkdfq" podStartSLOduration=27.444244887 podStartE2EDuration="27.444244887s" podCreationTimestamp="2026-03-16 00:14:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:15:14.410114751 +0000 UTC m=+507.506414724" watchObservedRunningTime="2026-03-16 00:15:14.444244887 +0000 UTC m=+507.540544880" Mar 16 00:15:31 crc kubenswrapper[4816]: I0316 00:15:31.863008 4816 patch_prober.go:28] interesting pod/machine-config-daemon-jrdcz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 16 00:15:31 crc kubenswrapper[4816]: I0316 00:15:31.864614 4816 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" podUID="dd08ece2-7636-4966-973a-e96a34b70b53" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 16 00:15:31 crc kubenswrapper[4816]: I0316 00:15:31.864753 4816 kubelet.go:2542] "SyncLoop (probe)" 
probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" Mar 16 00:15:31 crc kubenswrapper[4816]: I0316 00:15:31.865258 4816 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8214b8a7550606e587b215ee7c72e3638e054dd083cb6fa7b37990d33bec509b"} pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 16 00:15:31 crc kubenswrapper[4816]: I0316 00:15:31.865387 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" podUID="dd08ece2-7636-4966-973a-e96a34b70b53" containerName="machine-config-daemon" containerID="cri-o://8214b8a7550606e587b215ee7c72e3638e054dd083cb6fa7b37990d33bec509b" gracePeriod=600 Mar 16 00:15:32 crc kubenswrapper[4816]: I0316 00:15:32.513976 4816 generic.go:334] "Generic (PLEG): container finished" podID="dd08ece2-7636-4966-973a-e96a34b70b53" containerID="8214b8a7550606e587b215ee7c72e3638e054dd083cb6fa7b37990d33bec509b" exitCode=0 Mar 16 00:15:32 crc kubenswrapper[4816]: I0316 00:15:32.514063 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" event={"ID":"dd08ece2-7636-4966-973a-e96a34b70b53","Type":"ContainerDied","Data":"8214b8a7550606e587b215ee7c72e3638e054dd083cb6fa7b37990d33bec509b"} Mar 16 00:15:32 crc kubenswrapper[4816]: I0316 00:15:32.514382 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" event={"ID":"dd08ece2-7636-4966-973a-e96a34b70b53","Type":"ContainerStarted","Data":"054dcd9294a0533063364a3ea7e009e513fea0236f1afad37201a02a85a0eee4"} Mar 16 00:15:32 crc kubenswrapper[4816]: I0316 00:15:32.514407 4816 scope.go:117] "RemoveContainer" 
containerID="7003a4592a48f87ab63e9f5637c7665040f9784a7c8f599c82ed61270ddb793c" Mar 16 00:15:52 crc kubenswrapper[4816]: I0316 00:15:52.531520 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wh2h7"] Mar 16 00:15:52 crc kubenswrapper[4816]: I0316 00:15:52.532406 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-wh2h7" podUID="b1b3efd0-cdc0-4973-8077-bcd1ea567bdd" containerName="registry-server" containerID="cri-o://0362950976a76988474476b81bd7730cbe780ac154e5a2f2044e4e909795351d" gracePeriod=30 Mar 16 00:15:52 crc kubenswrapper[4816]: I0316 00:15:52.551441 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4gwcw"] Mar 16 00:15:52 crc kubenswrapper[4816]: I0316 00:15:52.551797 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-4gwcw" podUID="ad80e1a9-75dc-4860-9bd9-d59b0c0ae43c" containerName="registry-server" containerID="cri-o://1907f6f84400f8d2fe767c5f795be3bd07851337cba8cf48da1973d87467affc" gracePeriod=30 Mar 16 00:15:52 crc kubenswrapper[4816]: I0316 00:15:52.571788 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-8226q"] Mar 16 00:15:52 crc kubenswrapper[4816]: I0316 00:15:52.572230 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-8226q" podUID="02854230-6165-4f22-8780-d8591b991132" containerName="marketplace-operator" containerID="cri-o://6a102196a4ace87bc37cea4d4ac25a7e1b7077cd996f17f48fd3ea1d9ccb059e" gracePeriod=30 Mar 16 00:15:52 crc kubenswrapper[4816]: I0316 00:15:52.577520 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7pb49"] Mar 16 00:15:52 crc kubenswrapper[4816]: I0316 00:15:52.577846 4816 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-7pb49" podUID="a5ba22dd-8e8e-4beb-a540-e5c9687810b8" containerName="registry-server" containerID="cri-o://625b9a78bb0b854582527b25363acb2b99ec915fd9386d255757cbfa80fd76bf" gracePeriod=30 Mar 16 00:15:52 crc kubenswrapper[4816]: I0316 00:15:52.597177 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-52qs6"] Mar 16 00:15:52 crc kubenswrapper[4816]: I0316 00:15:52.598967 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-52qs6" podUID="6ca6c2c9-3a12-4eb3-9df1-7fdea640791d" containerName="registry-server" containerID="cri-o://f63537cc995d8e268cbd368b29b4cc5be951232e62b58e296828738dd881f0b2" gracePeriod=30 Mar 16 00:15:52 crc kubenswrapper[4816]: I0316 00:15:52.601712 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-8ln7g"] Mar 16 00:15:52 crc kubenswrapper[4816]: I0316 00:15:52.602450 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-8ln7g" Mar 16 00:15:52 crc kubenswrapper[4816]: I0316 00:15:52.616327 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-8ln7g"] Mar 16 00:15:52 crc kubenswrapper[4816]: E0316 00:15:52.666362 4816 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 1907f6f84400f8d2fe767c5f795be3bd07851337cba8cf48da1973d87467affc is running failed: container process not found" containerID="1907f6f84400f8d2fe767c5f795be3bd07851337cba8cf48da1973d87467affc" cmd=["grpc_health_probe","-addr=:50051"] Mar 16 00:15:52 crc kubenswrapper[4816]: E0316 00:15:52.666706 4816 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 1907f6f84400f8d2fe767c5f795be3bd07851337cba8cf48da1973d87467affc is running failed: container process not found" containerID="1907f6f84400f8d2fe767c5f795be3bd07851337cba8cf48da1973d87467affc" cmd=["grpc_health_probe","-addr=:50051"] Mar 16 00:15:52 crc kubenswrapper[4816]: E0316 00:15:52.666978 4816 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 1907f6f84400f8d2fe767c5f795be3bd07851337cba8cf48da1973d87467affc is running failed: container process not found" containerID="1907f6f84400f8d2fe767c5f795be3bd07851337cba8cf48da1973d87467affc" cmd=["grpc_health_probe","-addr=:50051"] Mar 16 00:15:52 crc kubenswrapper[4816]: E0316 00:15:52.667040 4816 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 1907f6f84400f8d2fe767c5f795be3bd07851337cba8cf48da1973d87467affc is running failed: container process not found" probeType="Readiness" 
pod="openshift-marketplace/community-operators-4gwcw" podUID="ad80e1a9-75dc-4860-9bd9-d59b0c0ae43c" containerName="registry-server" Mar 16 00:15:52 crc kubenswrapper[4816]: I0316 00:15:52.686682 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6d197f63-0b7c-496d-89bb-9cd70933969a-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-8ln7g\" (UID: \"6d197f63-0b7c-496d-89bb-9cd70933969a\") " pod="openshift-marketplace/marketplace-operator-79b997595-8ln7g" Mar 16 00:15:52 crc kubenswrapper[4816]: I0316 00:15:52.686739 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6d197f63-0b7c-496d-89bb-9cd70933969a-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-8ln7g\" (UID: \"6d197f63-0b7c-496d-89bb-9cd70933969a\") " pod="openshift-marketplace/marketplace-operator-79b997595-8ln7g" Mar 16 00:15:52 crc kubenswrapper[4816]: I0316 00:15:52.686769 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7twhq\" (UniqueName: \"kubernetes.io/projected/6d197f63-0b7c-496d-89bb-9cd70933969a-kube-api-access-7twhq\") pod \"marketplace-operator-79b997595-8ln7g\" (UID: \"6d197f63-0b7c-496d-89bb-9cd70933969a\") " pod="openshift-marketplace/marketplace-operator-79b997595-8ln7g" Mar 16 00:15:52 crc kubenswrapper[4816]: E0316 00:15:52.694232 4816 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod02854230_6165_4f22_8780_d8591b991132.slice/crio-6a102196a4ace87bc37cea4d4ac25a7e1b7077cd996f17f48fd3ea1d9ccb059e.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb1b3efd0_cdc0_4973_8077_bcd1ea567bdd.slice/crio-0362950976a76988474476b81bd7730cbe780ac154e5a2f2044e4e909795351d.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podad80e1a9_75dc_4860_9bd9_d59b0c0ae43c.slice/crio-1907f6f84400f8d2fe767c5f795be3bd07851337cba8cf48da1973d87467affc.scope\": RecentStats: unable to find data in memory cache]" Mar 16 00:15:52 crc kubenswrapper[4816]: I0316 00:15:52.796352 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6d197f63-0b7c-496d-89bb-9cd70933969a-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-8ln7g\" (UID: \"6d197f63-0b7c-496d-89bb-9cd70933969a\") " pod="openshift-marketplace/marketplace-operator-79b997595-8ln7g" Mar 16 00:15:52 crc kubenswrapper[4816]: I0316 00:15:52.796426 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6d197f63-0b7c-496d-89bb-9cd70933969a-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-8ln7g\" (UID: \"6d197f63-0b7c-496d-89bb-9cd70933969a\") " pod="openshift-marketplace/marketplace-operator-79b997595-8ln7g" Mar 16 00:15:52 crc kubenswrapper[4816]: I0316 00:15:52.796457 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7twhq\" (UniqueName: \"kubernetes.io/projected/6d197f63-0b7c-496d-89bb-9cd70933969a-kube-api-access-7twhq\") pod \"marketplace-operator-79b997595-8ln7g\" (UID: \"6d197f63-0b7c-496d-89bb-9cd70933969a\") " pod="openshift-marketplace/marketplace-operator-79b997595-8ln7g" Mar 16 00:15:52 crc kubenswrapper[4816]: I0316 00:15:52.798621 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/6d197f63-0b7c-496d-89bb-9cd70933969a-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-8ln7g\" (UID: \"6d197f63-0b7c-496d-89bb-9cd70933969a\") " pod="openshift-marketplace/marketplace-operator-79b997595-8ln7g" Mar 16 00:15:52 crc kubenswrapper[4816]: I0316 00:15:52.807298 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6d197f63-0b7c-496d-89bb-9cd70933969a-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-8ln7g\" (UID: \"6d197f63-0b7c-496d-89bb-9cd70933969a\") " pod="openshift-marketplace/marketplace-operator-79b997595-8ln7g" Mar 16 00:15:52 crc kubenswrapper[4816]: I0316 00:15:52.820369 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7twhq\" (UniqueName: \"kubernetes.io/projected/6d197f63-0b7c-496d-89bb-9cd70933969a-kube-api-access-7twhq\") pod \"marketplace-operator-79b997595-8ln7g\" (UID: \"6d197f63-0b7c-496d-89bb-9cd70933969a\") " pod="openshift-marketplace/marketplace-operator-79b997595-8ln7g" Mar 16 00:15:52 crc kubenswrapper[4816]: I0316 00:15:52.937738 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-8ln7g" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.019457 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-wh2h7" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.107378 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1b3efd0-cdc0-4973-8077-bcd1ea567bdd-utilities\") pod \"b1b3efd0-cdc0-4973-8077-bcd1ea567bdd\" (UID: \"b1b3efd0-cdc0-4973-8077-bcd1ea567bdd\") " Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.107430 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4mjs\" (UniqueName: \"kubernetes.io/projected/b1b3efd0-cdc0-4973-8077-bcd1ea567bdd-kube-api-access-w4mjs\") pod \"b1b3efd0-cdc0-4973-8077-bcd1ea567bdd\" (UID: \"b1b3efd0-cdc0-4973-8077-bcd1ea567bdd\") " Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.107516 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1b3efd0-cdc0-4973-8077-bcd1ea567bdd-catalog-content\") pod \"b1b3efd0-cdc0-4973-8077-bcd1ea567bdd\" (UID: \"b1b3efd0-cdc0-4973-8077-bcd1ea567bdd\") " Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.108809 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b1b3efd0-cdc0-4973-8077-bcd1ea567bdd-utilities" (OuterVolumeSpecName: "utilities") pod "b1b3efd0-cdc0-4973-8077-bcd1ea567bdd" (UID: "b1b3efd0-cdc0-4973-8077-bcd1ea567bdd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.120872 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1b3efd0-cdc0-4973-8077-bcd1ea567bdd-kube-api-access-w4mjs" (OuterVolumeSpecName: "kube-api-access-w4mjs") pod "b1b3efd0-cdc0-4973-8077-bcd1ea567bdd" (UID: "b1b3efd0-cdc0-4973-8077-bcd1ea567bdd"). InnerVolumeSpecName "kube-api-access-w4mjs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.132478 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-8226q" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.144406 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-52qs6" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.149301 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4gwcw" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.164584 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b1b3efd0-cdc0-4973-8077-bcd1ea567bdd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b1b3efd0-cdc0-4973-8077-bcd1ea567bdd" (UID: "b1b3efd0-cdc0-4973-8077-bcd1ea567bdd"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.181385 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7pb49" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.208153 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mrpff\" (UniqueName: \"kubernetes.io/projected/6ca6c2c9-3a12-4eb3-9df1-7fdea640791d-kube-api-access-mrpff\") pod \"6ca6c2c9-3a12-4eb3-9df1-7fdea640791d\" (UID: \"6ca6c2c9-3a12-4eb3-9df1-7fdea640791d\") " Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.208263 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ca6c2c9-3a12-4eb3-9df1-7fdea640791d-catalog-content\") pod \"6ca6c2c9-3a12-4eb3-9df1-7fdea640791d\" (UID: \"6ca6c2c9-3a12-4eb3-9df1-7fdea640791d\") " Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.208311 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/02854230-6165-4f22-8780-d8591b991132-marketplace-trusted-ca\") pod \"02854230-6165-4f22-8780-d8591b991132\" (UID: \"02854230-6165-4f22-8780-d8591b991132\") " Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.208373 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5ba22dd-8e8e-4beb-a540-e5c9687810b8-catalog-content\") pod \"a5ba22dd-8e8e-4beb-a540-e5c9687810b8\" (UID: \"a5ba22dd-8e8e-4beb-a540-e5c9687810b8\") " Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.209093 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/02854230-6165-4f22-8780-d8591b991132-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "02854230-6165-4f22-8780-d8591b991132" (UID: "02854230-6165-4f22-8780-d8591b991132"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.209164 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad80e1a9-75dc-4860-9bd9-d59b0c0ae43c-catalog-content\") pod \"ad80e1a9-75dc-4860-9bd9-d59b0c0ae43c\" (UID: \"ad80e1a9-75dc-4860-9bd9-d59b0c0ae43c\") " Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.209198 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j24xj\" (UniqueName: \"kubernetes.io/projected/a5ba22dd-8e8e-4beb-a540-e5c9687810b8-kube-api-access-j24xj\") pod \"a5ba22dd-8e8e-4beb-a540-e5c9687810b8\" (UID: \"a5ba22dd-8e8e-4beb-a540-e5c9687810b8\") " Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.209237 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/02854230-6165-4f22-8780-d8591b991132-marketplace-operator-metrics\") pod \"02854230-6165-4f22-8780-d8591b991132\" (UID: \"02854230-6165-4f22-8780-d8591b991132\") " Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.209268 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgh99\" (UniqueName: \"kubernetes.io/projected/02854230-6165-4f22-8780-d8591b991132-kube-api-access-zgh99\") pod \"02854230-6165-4f22-8780-d8591b991132\" (UID: \"02854230-6165-4f22-8780-d8591b991132\") " Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.209290 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-45bbd\" (UniqueName: \"kubernetes.io/projected/ad80e1a9-75dc-4860-9bd9-d59b0c0ae43c-kube-api-access-45bbd\") pod \"ad80e1a9-75dc-4860-9bd9-d59b0c0ae43c\" (UID: \"ad80e1a9-75dc-4860-9bd9-d59b0c0ae43c\") " Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.209318 4816 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ca6c2c9-3a12-4eb3-9df1-7fdea640791d-utilities\") pod \"6ca6c2c9-3a12-4eb3-9df1-7fdea640791d\" (UID: \"6ca6c2c9-3a12-4eb3-9df1-7fdea640791d\") " Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.209334 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad80e1a9-75dc-4860-9bd9-d59b0c0ae43c-utilities\") pod \"ad80e1a9-75dc-4860-9bd9-d59b0c0ae43c\" (UID: \"ad80e1a9-75dc-4860-9bd9-d59b0c0ae43c\") " Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.209355 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5ba22dd-8e8e-4beb-a540-e5c9687810b8-utilities\") pod \"a5ba22dd-8e8e-4beb-a540-e5c9687810b8\" (UID: \"a5ba22dd-8e8e-4beb-a540-e5c9687810b8\") " Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.209753 4816 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1b3efd0-cdc0-4973-8077-bcd1ea567bdd-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.209772 4816 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/02854230-6165-4f22-8780-d8591b991132-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.209786 4816 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1b3efd0-cdc0-4973-8077-bcd1ea567bdd-utilities\") on node \"crc\" DevicePath \"\"" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.209795 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4mjs\" (UniqueName: \"kubernetes.io/projected/b1b3efd0-cdc0-4973-8077-bcd1ea567bdd-kube-api-access-w4mjs\") 
on node \"crc\" DevicePath \"\"" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.210754 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a5ba22dd-8e8e-4beb-a540-e5c9687810b8-utilities" (OuterVolumeSpecName: "utilities") pod "a5ba22dd-8e8e-4beb-a540-e5c9687810b8" (UID: "a5ba22dd-8e8e-4beb-a540-e5c9687810b8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.212214 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ca6c2c9-3a12-4eb3-9df1-7fdea640791d-utilities" (OuterVolumeSpecName: "utilities") pod "6ca6c2c9-3a12-4eb3-9df1-7fdea640791d" (UID: "6ca6c2c9-3a12-4eb3-9df1-7fdea640791d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.213061 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad80e1a9-75dc-4860-9bd9-d59b0c0ae43c-utilities" (OuterVolumeSpecName: "utilities") pod "ad80e1a9-75dc-4860-9bd9-d59b0c0ae43c" (UID: "ad80e1a9-75dc-4860-9bd9-d59b0c0ae43c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.214407 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5ba22dd-8e8e-4beb-a540-e5c9687810b8-kube-api-access-j24xj" (OuterVolumeSpecName: "kube-api-access-j24xj") pod "a5ba22dd-8e8e-4beb-a540-e5c9687810b8" (UID: "a5ba22dd-8e8e-4beb-a540-e5c9687810b8"). InnerVolumeSpecName "kube-api-access-j24xj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.214511 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ca6c2c9-3a12-4eb3-9df1-7fdea640791d-kube-api-access-mrpff" (OuterVolumeSpecName: "kube-api-access-mrpff") pod "6ca6c2c9-3a12-4eb3-9df1-7fdea640791d" (UID: "6ca6c2c9-3a12-4eb3-9df1-7fdea640791d"). InnerVolumeSpecName "kube-api-access-mrpff". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.214620 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02854230-6165-4f22-8780-d8591b991132-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "02854230-6165-4f22-8780-d8591b991132" (UID: "02854230-6165-4f22-8780-d8591b991132"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.214730 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02854230-6165-4f22-8780-d8591b991132-kube-api-access-zgh99" (OuterVolumeSpecName: "kube-api-access-zgh99") pod "02854230-6165-4f22-8780-d8591b991132" (UID: "02854230-6165-4f22-8780-d8591b991132"). InnerVolumeSpecName "kube-api-access-zgh99". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.222236 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad80e1a9-75dc-4860-9bd9-d59b0c0ae43c-kube-api-access-45bbd" (OuterVolumeSpecName: "kube-api-access-45bbd") pod "ad80e1a9-75dc-4860-9bd9-d59b0c0ae43c" (UID: "ad80e1a9-75dc-4860-9bd9-d59b0c0ae43c"). InnerVolumeSpecName "kube-api-access-45bbd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.238095 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a5ba22dd-8e8e-4beb-a540-e5c9687810b8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a5ba22dd-8e8e-4beb-a540-e5c9687810b8" (UID: "a5ba22dd-8e8e-4beb-a540-e5c9687810b8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.266119 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad80e1a9-75dc-4860-9bd9-d59b0c0ae43c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ad80e1a9-75dc-4860-9bd9-d59b0c0ae43c" (UID: "ad80e1a9-75dc-4860-9bd9-d59b0c0ae43c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.311273 4816 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/02854230-6165-4f22-8780-d8591b991132-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.311328 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgh99\" (UniqueName: \"kubernetes.io/projected/02854230-6165-4f22-8780-d8591b991132-kube-api-access-zgh99\") on node \"crc\" DevicePath \"\"" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.311341 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-45bbd\" (UniqueName: \"kubernetes.io/projected/ad80e1a9-75dc-4860-9bd9-d59b0c0ae43c-kube-api-access-45bbd\") on node \"crc\" DevicePath \"\"" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.311356 4816 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/6ca6c2c9-3a12-4eb3-9df1-7fdea640791d-utilities\") on node \"crc\" DevicePath \"\"" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.311371 4816 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad80e1a9-75dc-4860-9bd9-d59b0c0ae43c-utilities\") on node \"crc\" DevicePath \"\"" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.311382 4816 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5ba22dd-8e8e-4beb-a540-e5c9687810b8-utilities\") on node \"crc\" DevicePath \"\"" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.311393 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mrpff\" (UniqueName: \"kubernetes.io/projected/6ca6c2c9-3a12-4eb3-9df1-7fdea640791d-kube-api-access-mrpff\") on node \"crc\" DevicePath \"\"" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.311404 4816 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5ba22dd-8e8e-4beb-a540-e5c9687810b8-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.311413 4816 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad80e1a9-75dc-4860-9bd9-d59b0c0ae43c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.311424 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j24xj\" (UniqueName: \"kubernetes.io/projected/a5ba22dd-8e8e-4beb-a540-e5c9687810b8-kube-api-access-j24xj\") on node \"crc\" DevicePath \"\"" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.337414 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ca6c2c9-3a12-4eb3-9df1-7fdea640791d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod 
"6ca6c2c9-3a12-4eb3-9df1-7fdea640791d" (UID: "6ca6c2c9-3a12-4eb3-9df1-7fdea640791d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.412879 4816 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ca6c2c9-3a12-4eb3-9df1-7fdea640791d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.472356 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-8ln7g"] Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.645084 4816 generic.go:334] "Generic (PLEG): container finished" podID="6ca6c2c9-3a12-4eb3-9df1-7fdea640791d" containerID="f63537cc995d8e268cbd368b29b4cc5be951232e62b58e296828738dd881f0b2" exitCode=0 Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.645141 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-52qs6" event={"ID":"6ca6c2c9-3a12-4eb3-9df1-7fdea640791d","Type":"ContainerDied","Data":"f63537cc995d8e268cbd368b29b4cc5be951232e62b58e296828738dd881f0b2"} Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.645169 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-52qs6" event={"ID":"6ca6c2c9-3a12-4eb3-9df1-7fdea640791d","Type":"ContainerDied","Data":"47689d47c5b861a3bd4357a2faba7a8ab87d56775475b31d461c37bf8423f524"} Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.645190 4816 scope.go:117] "RemoveContainer" containerID="f63537cc995d8e268cbd368b29b4cc5be951232e62b58e296828738dd881f0b2" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.645299 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-52qs6" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.654715 4816 generic.go:334] "Generic (PLEG): container finished" podID="a5ba22dd-8e8e-4beb-a540-e5c9687810b8" containerID="625b9a78bb0b854582527b25363acb2b99ec915fd9386d255757cbfa80fd76bf" exitCode=0 Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.654780 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7pb49" event={"ID":"a5ba22dd-8e8e-4beb-a540-e5c9687810b8","Type":"ContainerDied","Data":"625b9a78bb0b854582527b25363acb2b99ec915fd9386d255757cbfa80fd76bf"} Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.654807 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7pb49" event={"ID":"a5ba22dd-8e8e-4beb-a540-e5c9687810b8","Type":"ContainerDied","Data":"7718a309c71ba8a48a463087b2e901f51d954ea050a7be786e3c0a847d6a54eb"} Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.654868 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7pb49" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.659687 4816 generic.go:334] "Generic (PLEG): container finished" podID="ad80e1a9-75dc-4860-9bd9-d59b0c0ae43c" containerID="1907f6f84400f8d2fe767c5f795be3bd07851337cba8cf48da1973d87467affc" exitCode=0 Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.659756 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4gwcw" event={"ID":"ad80e1a9-75dc-4860-9bd9-d59b0c0ae43c","Type":"ContainerDied","Data":"1907f6f84400f8d2fe767c5f795be3bd07851337cba8cf48da1973d87467affc"} Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.659788 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4gwcw" event={"ID":"ad80e1a9-75dc-4860-9bd9-d59b0c0ae43c","Type":"ContainerDied","Data":"661437598c338aed0d5a7d52e67330434003899adaefd998268791f6175ab8ca"} Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.659861 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4gwcw" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.672345 4816 generic.go:334] "Generic (PLEG): container finished" podID="02854230-6165-4f22-8780-d8591b991132" containerID="6a102196a4ace87bc37cea4d4ac25a7e1b7077cd996f17f48fd3ea1d9ccb059e" exitCode=0 Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.672478 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-8226q" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.676711 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-8226q" event={"ID":"02854230-6165-4f22-8780-d8591b991132","Type":"ContainerDied","Data":"6a102196a4ace87bc37cea4d4ac25a7e1b7077cd996f17f48fd3ea1d9ccb059e"} Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.676754 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-8226q" event={"ID":"02854230-6165-4f22-8780-d8591b991132","Type":"ContainerDied","Data":"fbc545a6e69e36c7e153d8947909848cfdb5be666c80ed949869b9fabb25d45a"} Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.680976 4816 generic.go:334] "Generic (PLEG): container finished" podID="b1b3efd0-cdc0-4973-8077-bcd1ea567bdd" containerID="0362950976a76988474476b81bd7730cbe780ac154e5a2f2044e4e909795351d" exitCode=0 Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.681092 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wh2h7" event={"ID":"b1b3efd0-cdc0-4973-8077-bcd1ea567bdd","Type":"ContainerDied","Data":"0362950976a76988474476b81bd7730cbe780ac154e5a2f2044e4e909795351d"} Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.681118 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wh2h7" event={"ID":"b1b3efd0-cdc0-4973-8077-bcd1ea567bdd","Type":"ContainerDied","Data":"0e89bdbfb4ed11608191b3360966bdeb2f13767d41154d3097545518437bcaec"} Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.681198 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-wh2h7" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.686149 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-8ln7g" event={"ID":"6d197f63-0b7c-496d-89bb-9cd70933969a","Type":"ContainerStarted","Data":"cd3d797954a516d4cc00dfe574bc7782894689d4edd8ec9d9626045825209edb"} Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.686192 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-8ln7g" event={"ID":"6d197f63-0b7c-496d-89bb-9cd70933969a","Type":"ContainerStarted","Data":"e8b07ddb778279b84b392d5ba788140e7de4b1ade3517d804c46c4a859a16c55"} Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.687828 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-8ln7g" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.689187 4816 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-8ln7g container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.76:8080/healthz\": dial tcp 10.217.0.76:8080: connect: connection refused" start-of-body= Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.689245 4816 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-8ln7g" podUID="6d197f63-0b7c-496d-89bb-9cd70933969a" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.76:8080/healthz\": dial tcp 10.217.0.76:8080: connect: connection refused" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.702423 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-8ln7g" podStartSLOduration=1.702402886 podStartE2EDuration="1.702402886s" podCreationTimestamp="2026-03-16 
00:15:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:15:53.699012477 +0000 UTC m=+546.795312440" watchObservedRunningTime="2026-03-16 00:15:53.702402886 +0000 UTC m=+546.798702839" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.709695 4816 scope.go:117] "RemoveContainer" containerID="43853e41b6150efe99d4e5270bddef069e33c9677ee5b6b76a29e50bb9d6dc72" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.741671 4816 scope.go:117] "RemoveContainer" containerID="056d2f945def262f75a277cf64e3b3f1d5e6532d0a76f4258d544edfbbff2503" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.763167 4816 scope.go:117] "RemoveContainer" containerID="f63537cc995d8e268cbd368b29b4cc5be951232e62b58e296828738dd881f0b2" Mar 16 00:15:53 crc kubenswrapper[4816]: E0316 00:15:53.765380 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f63537cc995d8e268cbd368b29b4cc5be951232e62b58e296828738dd881f0b2\": container with ID starting with f63537cc995d8e268cbd368b29b4cc5be951232e62b58e296828738dd881f0b2 not found: ID does not exist" containerID="f63537cc995d8e268cbd368b29b4cc5be951232e62b58e296828738dd881f0b2" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.765413 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f63537cc995d8e268cbd368b29b4cc5be951232e62b58e296828738dd881f0b2"} err="failed to get container status \"f63537cc995d8e268cbd368b29b4cc5be951232e62b58e296828738dd881f0b2\": rpc error: code = NotFound desc = could not find container \"f63537cc995d8e268cbd368b29b4cc5be951232e62b58e296828738dd881f0b2\": container with ID starting with f63537cc995d8e268cbd368b29b4cc5be951232e62b58e296828738dd881f0b2 not found: ID does not exist" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.765436 4816 scope.go:117] "RemoveContainer" 
containerID="43853e41b6150efe99d4e5270bddef069e33c9677ee5b6b76a29e50bb9d6dc72" Mar 16 00:15:53 crc kubenswrapper[4816]: E0316 00:15:53.765789 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43853e41b6150efe99d4e5270bddef069e33c9677ee5b6b76a29e50bb9d6dc72\": container with ID starting with 43853e41b6150efe99d4e5270bddef069e33c9677ee5b6b76a29e50bb9d6dc72 not found: ID does not exist" containerID="43853e41b6150efe99d4e5270bddef069e33c9677ee5b6b76a29e50bb9d6dc72" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.765811 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43853e41b6150efe99d4e5270bddef069e33c9677ee5b6b76a29e50bb9d6dc72"} err="failed to get container status \"43853e41b6150efe99d4e5270bddef069e33c9677ee5b6b76a29e50bb9d6dc72\": rpc error: code = NotFound desc = could not find container \"43853e41b6150efe99d4e5270bddef069e33c9677ee5b6b76a29e50bb9d6dc72\": container with ID starting with 43853e41b6150efe99d4e5270bddef069e33c9677ee5b6b76a29e50bb9d6dc72 not found: ID does not exist" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.765823 4816 scope.go:117] "RemoveContainer" containerID="056d2f945def262f75a277cf64e3b3f1d5e6532d0a76f4258d544edfbbff2503" Mar 16 00:15:53 crc kubenswrapper[4816]: E0316 00:15:53.766179 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"056d2f945def262f75a277cf64e3b3f1d5e6532d0a76f4258d544edfbbff2503\": container with ID starting with 056d2f945def262f75a277cf64e3b3f1d5e6532d0a76f4258d544edfbbff2503 not found: ID does not exist" containerID="056d2f945def262f75a277cf64e3b3f1d5e6532d0a76f4258d544edfbbff2503" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.766206 4816 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"056d2f945def262f75a277cf64e3b3f1d5e6532d0a76f4258d544edfbbff2503"} err="failed to get container status \"056d2f945def262f75a277cf64e3b3f1d5e6532d0a76f4258d544edfbbff2503\": rpc error: code = NotFound desc = could not find container \"056d2f945def262f75a277cf64e3b3f1d5e6532d0a76f4258d544edfbbff2503\": container with ID starting with 056d2f945def262f75a277cf64e3b3f1d5e6532d0a76f4258d544edfbbff2503 not found: ID does not exist" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.766218 4816 scope.go:117] "RemoveContainer" containerID="625b9a78bb0b854582527b25363acb2b99ec915fd9386d255757cbfa80fd76bf" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.770117 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4gwcw"] Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.774507 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-4gwcw"] Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.778857 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-52qs6"] Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.782268 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-52qs6"] Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.791570 4816 scope.go:117] "RemoveContainer" containerID="908485a9ff25d0805bbaf35b08443a41a047722c7887492c6b9436d8c8c3aabc" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.795107 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-8226q"] Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.799525 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-8226q"] Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.810151 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-7pb49"] Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.818063 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-7pb49"] Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.820050 4816 scope.go:117] "RemoveContainer" containerID="f01a941d4255e96b1cacdf7b072faf6dd7d1c330c59ed7ec0d337e5fb9c8854f" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.821644 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wh2h7"] Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.825445 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-wh2h7"] Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.832858 4816 scope.go:117] "RemoveContainer" containerID="625b9a78bb0b854582527b25363acb2b99ec915fd9386d255757cbfa80fd76bf" Mar 16 00:15:53 crc kubenswrapper[4816]: E0316 00:15:53.833165 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"625b9a78bb0b854582527b25363acb2b99ec915fd9386d255757cbfa80fd76bf\": container with ID starting with 625b9a78bb0b854582527b25363acb2b99ec915fd9386d255757cbfa80fd76bf not found: ID does not exist" containerID="625b9a78bb0b854582527b25363acb2b99ec915fd9386d255757cbfa80fd76bf" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.833195 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"625b9a78bb0b854582527b25363acb2b99ec915fd9386d255757cbfa80fd76bf"} err="failed to get container status \"625b9a78bb0b854582527b25363acb2b99ec915fd9386d255757cbfa80fd76bf\": rpc error: code = NotFound desc = could not find container \"625b9a78bb0b854582527b25363acb2b99ec915fd9386d255757cbfa80fd76bf\": container with ID starting with 625b9a78bb0b854582527b25363acb2b99ec915fd9386d255757cbfa80fd76bf not found: ID does not exist" Mar 16 
00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.833216 4816 scope.go:117] "RemoveContainer" containerID="908485a9ff25d0805bbaf35b08443a41a047722c7887492c6b9436d8c8c3aabc" Mar 16 00:15:53 crc kubenswrapper[4816]: E0316 00:15:53.833541 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"908485a9ff25d0805bbaf35b08443a41a047722c7887492c6b9436d8c8c3aabc\": container with ID starting with 908485a9ff25d0805bbaf35b08443a41a047722c7887492c6b9436d8c8c3aabc not found: ID does not exist" containerID="908485a9ff25d0805bbaf35b08443a41a047722c7887492c6b9436d8c8c3aabc" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.833576 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"908485a9ff25d0805bbaf35b08443a41a047722c7887492c6b9436d8c8c3aabc"} err="failed to get container status \"908485a9ff25d0805bbaf35b08443a41a047722c7887492c6b9436d8c8c3aabc\": rpc error: code = NotFound desc = could not find container \"908485a9ff25d0805bbaf35b08443a41a047722c7887492c6b9436d8c8c3aabc\": container with ID starting with 908485a9ff25d0805bbaf35b08443a41a047722c7887492c6b9436d8c8c3aabc not found: ID does not exist" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.833589 4816 scope.go:117] "RemoveContainer" containerID="f01a941d4255e96b1cacdf7b072faf6dd7d1c330c59ed7ec0d337e5fb9c8854f" Mar 16 00:15:53 crc kubenswrapper[4816]: E0316 00:15:53.833933 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f01a941d4255e96b1cacdf7b072faf6dd7d1c330c59ed7ec0d337e5fb9c8854f\": container with ID starting with f01a941d4255e96b1cacdf7b072faf6dd7d1c330c59ed7ec0d337e5fb9c8854f not found: ID does not exist" containerID="f01a941d4255e96b1cacdf7b072faf6dd7d1c330c59ed7ec0d337e5fb9c8854f" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.833983 4816 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"f01a941d4255e96b1cacdf7b072faf6dd7d1c330c59ed7ec0d337e5fb9c8854f"} err="failed to get container status \"f01a941d4255e96b1cacdf7b072faf6dd7d1c330c59ed7ec0d337e5fb9c8854f\": rpc error: code = NotFound desc = could not find container \"f01a941d4255e96b1cacdf7b072faf6dd7d1c330c59ed7ec0d337e5fb9c8854f\": container with ID starting with f01a941d4255e96b1cacdf7b072faf6dd7d1c330c59ed7ec0d337e5fb9c8854f not found: ID does not exist" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.834018 4816 scope.go:117] "RemoveContainer" containerID="1907f6f84400f8d2fe767c5f795be3bd07851337cba8cf48da1973d87467affc" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.847205 4816 scope.go:117] "RemoveContainer" containerID="67c2ff8f6f6f445d4fd7c00ab7519136e896d1421d5929d43ac69cca289fa317" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.859988 4816 scope.go:117] "RemoveContainer" containerID="16a79ba542cf5cd202f64693662a992a86e39b69458f390ed8cb6f7dbabffd89" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.874106 4816 scope.go:117] "RemoveContainer" containerID="1907f6f84400f8d2fe767c5f795be3bd07851337cba8cf48da1973d87467affc" Mar 16 00:15:53 crc kubenswrapper[4816]: E0316 00:15:53.874492 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1907f6f84400f8d2fe767c5f795be3bd07851337cba8cf48da1973d87467affc\": container with ID starting with 1907f6f84400f8d2fe767c5f795be3bd07851337cba8cf48da1973d87467affc not found: ID does not exist" containerID="1907f6f84400f8d2fe767c5f795be3bd07851337cba8cf48da1973d87467affc" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.874521 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1907f6f84400f8d2fe767c5f795be3bd07851337cba8cf48da1973d87467affc"} err="failed to get container status \"1907f6f84400f8d2fe767c5f795be3bd07851337cba8cf48da1973d87467affc\": rpc error: 
code = NotFound desc = could not find container \"1907f6f84400f8d2fe767c5f795be3bd07851337cba8cf48da1973d87467affc\": container with ID starting with 1907f6f84400f8d2fe767c5f795be3bd07851337cba8cf48da1973d87467affc not found: ID does not exist" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.874563 4816 scope.go:117] "RemoveContainer" containerID="67c2ff8f6f6f445d4fd7c00ab7519136e896d1421d5929d43ac69cca289fa317" Mar 16 00:15:53 crc kubenswrapper[4816]: E0316 00:15:53.875047 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"67c2ff8f6f6f445d4fd7c00ab7519136e896d1421d5929d43ac69cca289fa317\": container with ID starting with 67c2ff8f6f6f445d4fd7c00ab7519136e896d1421d5929d43ac69cca289fa317 not found: ID does not exist" containerID="67c2ff8f6f6f445d4fd7c00ab7519136e896d1421d5929d43ac69cca289fa317" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.875067 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67c2ff8f6f6f445d4fd7c00ab7519136e896d1421d5929d43ac69cca289fa317"} err="failed to get container status \"67c2ff8f6f6f445d4fd7c00ab7519136e896d1421d5929d43ac69cca289fa317\": rpc error: code = NotFound desc = could not find container \"67c2ff8f6f6f445d4fd7c00ab7519136e896d1421d5929d43ac69cca289fa317\": container with ID starting with 67c2ff8f6f6f445d4fd7c00ab7519136e896d1421d5929d43ac69cca289fa317 not found: ID does not exist" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.875079 4816 scope.go:117] "RemoveContainer" containerID="16a79ba542cf5cd202f64693662a992a86e39b69458f390ed8cb6f7dbabffd89" Mar 16 00:15:53 crc kubenswrapper[4816]: E0316 00:15:53.875274 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"16a79ba542cf5cd202f64693662a992a86e39b69458f390ed8cb6f7dbabffd89\": container with ID starting with 
16a79ba542cf5cd202f64693662a992a86e39b69458f390ed8cb6f7dbabffd89 not found: ID does not exist" containerID="16a79ba542cf5cd202f64693662a992a86e39b69458f390ed8cb6f7dbabffd89" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.875294 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16a79ba542cf5cd202f64693662a992a86e39b69458f390ed8cb6f7dbabffd89"} err="failed to get container status \"16a79ba542cf5cd202f64693662a992a86e39b69458f390ed8cb6f7dbabffd89\": rpc error: code = NotFound desc = could not find container \"16a79ba542cf5cd202f64693662a992a86e39b69458f390ed8cb6f7dbabffd89\": container with ID starting with 16a79ba542cf5cd202f64693662a992a86e39b69458f390ed8cb6f7dbabffd89 not found: ID does not exist" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.875307 4816 scope.go:117] "RemoveContainer" containerID="6a102196a4ace87bc37cea4d4ac25a7e1b7077cd996f17f48fd3ea1d9ccb059e" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.893291 4816 scope.go:117] "RemoveContainer" containerID="6a102196a4ace87bc37cea4d4ac25a7e1b7077cd996f17f48fd3ea1d9ccb059e" Mar 16 00:15:53 crc kubenswrapper[4816]: E0316 00:15:53.894781 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a102196a4ace87bc37cea4d4ac25a7e1b7077cd996f17f48fd3ea1d9ccb059e\": container with ID starting with 6a102196a4ace87bc37cea4d4ac25a7e1b7077cd996f17f48fd3ea1d9ccb059e not found: ID does not exist" containerID="6a102196a4ace87bc37cea4d4ac25a7e1b7077cd996f17f48fd3ea1d9ccb059e" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.894809 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a102196a4ace87bc37cea4d4ac25a7e1b7077cd996f17f48fd3ea1d9ccb059e"} err="failed to get container status \"6a102196a4ace87bc37cea4d4ac25a7e1b7077cd996f17f48fd3ea1d9ccb059e\": rpc error: code = NotFound desc = could not find container 
\"6a102196a4ace87bc37cea4d4ac25a7e1b7077cd996f17f48fd3ea1d9ccb059e\": container with ID starting with 6a102196a4ace87bc37cea4d4ac25a7e1b7077cd996f17f48fd3ea1d9ccb059e not found: ID does not exist" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.894855 4816 scope.go:117] "RemoveContainer" containerID="0362950976a76988474476b81bd7730cbe780ac154e5a2f2044e4e909795351d" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.909308 4816 scope.go:117] "RemoveContainer" containerID="c3b7cea0d43489debd23962525e8aaaf02e629e9ef59a654ba5aa278317285e1" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.925903 4816 scope.go:117] "RemoveContainer" containerID="c898ff116eae4b6d21df6664b26fd09b1827c5eec1cd2e0032f6fcd35a691638" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.939929 4816 scope.go:117] "RemoveContainer" containerID="0362950976a76988474476b81bd7730cbe780ac154e5a2f2044e4e909795351d" Mar 16 00:15:53 crc kubenswrapper[4816]: E0316 00:15:53.940421 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0362950976a76988474476b81bd7730cbe780ac154e5a2f2044e4e909795351d\": container with ID starting with 0362950976a76988474476b81bd7730cbe780ac154e5a2f2044e4e909795351d not found: ID does not exist" containerID="0362950976a76988474476b81bd7730cbe780ac154e5a2f2044e4e909795351d" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.940451 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0362950976a76988474476b81bd7730cbe780ac154e5a2f2044e4e909795351d"} err="failed to get container status \"0362950976a76988474476b81bd7730cbe780ac154e5a2f2044e4e909795351d\": rpc error: code = NotFound desc = could not find container \"0362950976a76988474476b81bd7730cbe780ac154e5a2f2044e4e909795351d\": container with ID starting with 0362950976a76988474476b81bd7730cbe780ac154e5a2f2044e4e909795351d not found: ID does not exist" Mar 16 00:15:53 crc 
kubenswrapper[4816]: I0316 00:15:53.940471 4816 scope.go:117] "RemoveContainer" containerID="c3b7cea0d43489debd23962525e8aaaf02e629e9ef59a654ba5aa278317285e1" Mar 16 00:15:53 crc kubenswrapper[4816]: E0316 00:15:53.940947 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c3b7cea0d43489debd23962525e8aaaf02e629e9ef59a654ba5aa278317285e1\": container with ID starting with c3b7cea0d43489debd23962525e8aaaf02e629e9ef59a654ba5aa278317285e1 not found: ID does not exist" containerID="c3b7cea0d43489debd23962525e8aaaf02e629e9ef59a654ba5aa278317285e1" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.940970 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3b7cea0d43489debd23962525e8aaaf02e629e9ef59a654ba5aa278317285e1"} err="failed to get container status \"c3b7cea0d43489debd23962525e8aaaf02e629e9ef59a654ba5aa278317285e1\": rpc error: code = NotFound desc = could not find container \"c3b7cea0d43489debd23962525e8aaaf02e629e9ef59a654ba5aa278317285e1\": container with ID starting with c3b7cea0d43489debd23962525e8aaaf02e629e9ef59a654ba5aa278317285e1 not found: ID does not exist" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.940986 4816 scope.go:117] "RemoveContainer" containerID="c898ff116eae4b6d21df6664b26fd09b1827c5eec1cd2e0032f6fcd35a691638" Mar 16 00:15:53 crc kubenswrapper[4816]: E0316 00:15:53.941189 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c898ff116eae4b6d21df6664b26fd09b1827c5eec1cd2e0032f6fcd35a691638\": container with ID starting with c898ff116eae4b6d21df6664b26fd09b1827c5eec1cd2e0032f6fcd35a691638 not found: ID does not exist" containerID="c898ff116eae4b6d21df6664b26fd09b1827c5eec1cd2e0032f6fcd35a691638" Mar 16 00:15:53 crc kubenswrapper[4816]: I0316 00:15:53.941210 4816 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"c898ff116eae4b6d21df6664b26fd09b1827c5eec1cd2e0032f6fcd35a691638"} err="failed to get container status \"c898ff116eae4b6d21df6664b26fd09b1827c5eec1cd2e0032f6fcd35a691638\": rpc error: code = NotFound desc = could not find container \"c898ff116eae4b6d21df6664b26fd09b1827c5eec1cd2e0032f6fcd35a691638\": container with ID starting with c898ff116eae4b6d21df6664b26fd09b1827c5eec1cd2e0032f6fcd35a691638 not found: ID does not exist" Mar 16 00:15:54 crc kubenswrapper[4816]: I0316 00:15:54.698099 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-8ln7g" Mar 16 00:15:55 crc kubenswrapper[4816]: I0316 00:15:55.545859 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-nvgvc"] Mar 16 00:15:55 crc kubenswrapper[4816]: E0316 00:15:55.546105 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1b3efd0-cdc0-4973-8077-bcd1ea567bdd" containerName="registry-server" Mar 16 00:15:55 crc kubenswrapper[4816]: I0316 00:15:55.546120 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1b3efd0-cdc0-4973-8077-bcd1ea567bdd" containerName="registry-server" Mar 16 00:15:55 crc kubenswrapper[4816]: E0316 00:15:55.546136 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1b3efd0-cdc0-4973-8077-bcd1ea567bdd" containerName="extract-content" Mar 16 00:15:55 crc kubenswrapper[4816]: I0316 00:15:55.546144 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1b3efd0-cdc0-4973-8077-bcd1ea567bdd" containerName="extract-content" Mar 16 00:15:55 crc kubenswrapper[4816]: E0316 00:15:55.546156 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ca6c2c9-3a12-4eb3-9df1-7fdea640791d" containerName="extract-content" Mar 16 00:15:55 crc kubenswrapper[4816]: I0316 00:15:55.546164 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ca6c2c9-3a12-4eb3-9df1-7fdea640791d" 
containerName="extract-content" Mar 16 00:15:55 crc kubenswrapper[4816]: E0316 00:15:55.546175 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad80e1a9-75dc-4860-9bd9-d59b0c0ae43c" containerName="registry-server" Mar 16 00:15:55 crc kubenswrapper[4816]: I0316 00:15:55.546183 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad80e1a9-75dc-4860-9bd9-d59b0c0ae43c" containerName="registry-server" Mar 16 00:15:55 crc kubenswrapper[4816]: E0316 00:15:55.546196 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad80e1a9-75dc-4860-9bd9-d59b0c0ae43c" containerName="extract-utilities" Mar 16 00:15:55 crc kubenswrapper[4816]: I0316 00:15:55.546203 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad80e1a9-75dc-4860-9bd9-d59b0c0ae43c" containerName="extract-utilities" Mar 16 00:15:55 crc kubenswrapper[4816]: E0316 00:15:55.546210 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ca6c2c9-3a12-4eb3-9df1-7fdea640791d" containerName="extract-utilities" Mar 16 00:15:55 crc kubenswrapper[4816]: I0316 00:15:55.546218 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ca6c2c9-3a12-4eb3-9df1-7fdea640791d" containerName="extract-utilities" Mar 16 00:15:55 crc kubenswrapper[4816]: E0316 00:15:55.546229 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad80e1a9-75dc-4860-9bd9-d59b0c0ae43c" containerName="extract-content" Mar 16 00:15:55 crc kubenswrapper[4816]: I0316 00:15:55.546236 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad80e1a9-75dc-4860-9bd9-d59b0c0ae43c" containerName="extract-content" Mar 16 00:15:55 crc kubenswrapper[4816]: E0316 00:15:55.546246 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5ba22dd-8e8e-4beb-a540-e5c9687810b8" containerName="extract-utilities" Mar 16 00:15:55 crc kubenswrapper[4816]: I0316 00:15:55.546255 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5ba22dd-8e8e-4beb-a540-e5c9687810b8" 
containerName="extract-utilities" Mar 16 00:15:55 crc kubenswrapper[4816]: E0316 00:15:55.546263 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5ba22dd-8e8e-4beb-a540-e5c9687810b8" containerName="registry-server" Mar 16 00:15:55 crc kubenswrapper[4816]: I0316 00:15:55.546272 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5ba22dd-8e8e-4beb-a540-e5c9687810b8" containerName="registry-server" Mar 16 00:15:55 crc kubenswrapper[4816]: E0316 00:15:55.546283 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5ba22dd-8e8e-4beb-a540-e5c9687810b8" containerName="extract-content" Mar 16 00:15:55 crc kubenswrapper[4816]: I0316 00:15:55.546290 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5ba22dd-8e8e-4beb-a540-e5c9687810b8" containerName="extract-content" Mar 16 00:15:55 crc kubenswrapper[4816]: E0316 00:15:55.546303 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ca6c2c9-3a12-4eb3-9df1-7fdea640791d" containerName="registry-server" Mar 16 00:15:55 crc kubenswrapper[4816]: I0316 00:15:55.546310 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ca6c2c9-3a12-4eb3-9df1-7fdea640791d" containerName="registry-server" Mar 16 00:15:55 crc kubenswrapper[4816]: E0316 00:15:55.546318 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1b3efd0-cdc0-4973-8077-bcd1ea567bdd" containerName="extract-utilities" Mar 16 00:15:55 crc kubenswrapper[4816]: I0316 00:15:55.546328 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1b3efd0-cdc0-4973-8077-bcd1ea567bdd" containerName="extract-utilities" Mar 16 00:15:55 crc kubenswrapper[4816]: E0316 00:15:55.546339 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02854230-6165-4f22-8780-d8591b991132" containerName="marketplace-operator" Mar 16 00:15:55 crc kubenswrapper[4816]: I0316 00:15:55.546347 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="02854230-6165-4f22-8780-d8591b991132" 
containerName="marketplace-operator" Mar 16 00:15:55 crc kubenswrapper[4816]: I0316 00:15:55.546456 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="02854230-6165-4f22-8780-d8591b991132" containerName="marketplace-operator" Mar 16 00:15:55 crc kubenswrapper[4816]: I0316 00:15:55.546467 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad80e1a9-75dc-4860-9bd9-d59b0c0ae43c" containerName="registry-server" Mar 16 00:15:55 crc kubenswrapper[4816]: I0316 00:15:55.546477 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1b3efd0-cdc0-4973-8077-bcd1ea567bdd" containerName="registry-server" Mar 16 00:15:55 crc kubenswrapper[4816]: I0316 00:15:55.546491 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5ba22dd-8e8e-4beb-a540-e5c9687810b8" containerName="registry-server" Mar 16 00:15:55 crc kubenswrapper[4816]: I0316 00:15:55.546498 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ca6c2c9-3a12-4eb3-9df1-7fdea640791d" containerName="registry-server" Mar 16 00:15:55 crc kubenswrapper[4816]: I0316 00:15:55.547318 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nvgvc" Mar 16 00:15:55 crc kubenswrapper[4816]: I0316 00:15:55.552294 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 16 00:15:55 crc kubenswrapper[4816]: I0316 00:15:55.558920 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-nvgvc"] Mar 16 00:15:55 crc kubenswrapper[4816]: I0316 00:15:55.643972 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qh6xk\" (UniqueName: \"kubernetes.io/projected/63fdf2e4-cd4d-40ad-ae9e-60642f5f47fa-kube-api-access-qh6xk\") pod \"redhat-marketplace-nvgvc\" (UID: \"63fdf2e4-cd4d-40ad-ae9e-60642f5f47fa\") " pod="openshift-marketplace/redhat-marketplace-nvgvc" Mar 16 00:15:55 crc kubenswrapper[4816]: I0316 00:15:55.644021 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63fdf2e4-cd4d-40ad-ae9e-60642f5f47fa-utilities\") pod \"redhat-marketplace-nvgvc\" (UID: \"63fdf2e4-cd4d-40ad-ae9e-60642f5f47fa\") " pod="openshift-marketplace/redhat-marketplace-nvgvc" Mar 16 00:15:55 crc kubenswrapper[4816]: I0316 00:15:55.644234 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63fdf2e4-cd4d-40ad-ae9e-60642f5f47fa-catalog-content\") pod \"redhat-marketplace-nvgvc\" (UID: \"63fdf2e4-cd4d-40ad-ae9e-60642f5f47fa\") " pod="openshift-marketplace/redhat-marketplace-nvgvc" Mar 16 00:15:55 crc kubenswrapper[4816]: I0316 00:15:55.674090 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02854230-6165-4f22-8780-d8591b991132" path="/var/lib/kubelet/pods/02854230-6165-4f22-8780-d8591b991132/volumes" Mar 16 00:15:55 crc kubenswrapper[4816]: I0316 00:15:55.674653 
4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ca6c2c9-3a12-4eb3-9df1-7fdea640791d" path="/var/lib/kubelet/pods/6ca6c2c9-3a12-4eb3-9df1-7fdea640791d/volumes" Mar 16 00:15:55 crc kubenswrapper[4816]: I0316 00:15:55.675194 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a5ba22dd-8e8e-4beb-a540-e5c9687810b8" path="/var/lib/kubelet/pods/a5ba22dd-8e8e-4beb-a540-e5c9687810b8/volumes" Mar 16 00:15:55 crc kubenswrapper[4816]: I0316 00:15:55.676182 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad80e1a9-75dc-4860-9bd9-d59b0c0ae43c" path="/var/lib/kubelet/pods/ad80e1a9-75dc-4860-9bd9-d59b0c0ae43c/volumes" Mar 16 00:15:55 crc kubenswrapper[4816]: I0316 00:15:55.676755 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b1b3efd0-cdc0-4973-8077-bcd1ea567bdd" path="/var/lib/kubelet/pods/b1b3efd0-cdc0-4973-8077-bcd1ea567bdd/volumes" Mar 16 00:15:55 crc kubenswrapper[4816]: I0316 00:15:55.745092 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63fdf2e4-cd4d-40ad-ae9e-60642f5f47fa-catalog-content\") pod \"redhat-marketplace-nvgvc\" (UID: \"63fdf2e4-cd4d-40ad-ae9e-60642f5f47fa\") " pod="openshift-marketplace/redhat-marketplace-nvgvc" Mar 16 00:15:55 crc kubenswrapper[4816]: I0316 00:15:55.745203 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qh6xk\" (UniqueName: \"kubernetes.io/projected/63fdf2e4-cd4d-40ad-ae9e-60642f5f47fa-kube-api-access-qh6xk\") pod \"redhat-marketplace-nvgvc\" (UID: \"63fdf2e4-cd4d-40ad-ae9e-60642f5f47fa\") " pod="openshift-marketplace/redhat-marketplace-nvgvc" Mar 16 00:15:55 crc kubenswrapper[4816]: I0316 00:15:55.745223 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63fdf2e4-cd4d-40ad-ae9e-60642f5f47fa-utilities\") pod 
\"redhat-marketplace-nvgvc\" (UID: \"63fdf2e4-cd4d-40ad-ae9e-60642f5f47fa\") " pod="openshift-marketplace/redhat-marketplace-nvgvc" Mar 16 00:15:55 crc kubenswrapper[4816]: I0316 00:15:55.745728 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63fdf2e4-cd4d-40ad-ae9e-60642f5f47fa-utilities\") pod \"redhat-marketplace-nvgvc\" (UID: \"63fdf2e4-cd4d-40ad-ae9e-60642f5f47fa\") " pod="openshift-marketplace/redhat-marketplace-nvgvc" Mar 16 00:15:55 crc kubenswrapper[4816]: I0316 00:15:55.746208 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63fdf2e4-cd4d-40ad-ae9e-60642f5f47fa-catalog-content\") pod \"redhat-marketplace-nvgvc\" (UID: \"63fdf2e4-cd4d-40ad-ae9e-60642f5f47fa\") " pod="openshift-marketplace/redhat-marketplace-nvgvc" Mar 16 00:15:55 crc kubenswrapper[4816]: I0316 00:15:55.769958 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qh6xk\" (UniqueName: \"kubernetes.io/projected/63fdf2e4-cd4d-40ad-ae9e-60642f5f47fa-kube-api-access-qh6xk\") pod \"redhat-marketplace-nvgvc\" (UID: \"63fdf2e4-cd4d-40ad-ae9e-60642f5f47fa\") " pod="openshift-marketplace/redhat-marketplace-nvgvc" Mar 16 00:15:55 crc kubenswrapper[4816]: I0316 00:15:55.864933 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nvgvc" Mar 16 00:15:56 crc kubenswrapper[4816]: I0316 00:15:56.147335 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-hmzx7"] Mar 16 00:15:56 crc kubenswrapper[4816]: I0316 00:15:56.148892 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-hmzx7" Mar 16 00:15:56 crc kubenswrapper[4816]: I0316 00:15:56.151967 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 16 00:15:56 crc kubenswrapper[4816]: I0316 00:15:56.163797 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hmzx7"] Mar 16 00:15:56 crc kubenswrapper[4816]: I0316 00:15:56.250882 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2xct\" (UniqueName: \"kubernetes.io/projected/6df1dc3a-6abd-4ffc-b27b-e66f281ed273-kube-api-access-w2xct\") pod \"redhat-operators-hmzx7\" (UID: \"6df1dc3a-6abd-4ffc-b27b-e66f281ed273\") " pod="openshift-marketplace/redhat-operators-hmzx7" Mar 16 00:15:56 crc kubenswrapper[4816]: I0316 00:15:56.250945 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6df1dc3a-6abd-4ffc-b27b-e66f281ed273-utilities\") pod \"redhat-operators-hmzx7\" (UID: \"6df1dc3a-6abd-4ffc-b27b-e66f281ed273\") " pod="openshift-marketplace/redhat-operators-hmzx7" Mar 16 00:15:56 crc kubenswrapper[4816]: I0316 00:15:56.250998 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6df1dc3a-6abd-4ffc-b27b-e66f281ed273-catalog-content\") pod \"redhat-operators-hmzx7\" (UID: \"6df1dc3a-6abd-4ffc-b27b-e66f281ed273\") " pod="openshift-marketplace/redhat-operators-hmzx7" Mar 16 00:15:56 crc kubenswrapper[4816]: I0316 00:15:56.254597 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-nvgvc"] Mar 16 00:15:56 crc kubenswrapper[4816]: I0316 00:15:56.352284 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6df1dc3a-6abd-4ffc-b27b-e66f281ed273-catalog-content\") pod \"redhat-operators-hmzx7\" (UID: \"6df1dc3a-6abd-4ffc-b27b-e66f281ed273\") " pod="openshift-marketplace/redhat-operators-hmzx7" Mar 16 00:15:56 crc kubenswrapper[4816]: I0316 00:15:56.352409 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w2xct\" (UniqueName: \"kubernetes.io/projected/6df1dc3a-6abd-4ffc-b27b-e66f281ed273-kube-api-access-w2xct\") pod \"redhat-operators-hmzx7\" (UID: \"6df1dc3a-6abd-4ffc-b27b-e66f281ed273\") " pod="openshift-marketplace/redhat-operators-hmzx7" Mar 16 00:15:56 crc kubenswrapper[4816]: I0316 00:15:56.352454 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6df1dc3a-6abd-4ffc-b27b-e66f281ed273-utilities\") pod \"redhat-operators-hmzx7\" (UID: \"6df1dc3a-6abd-4ffc-b27b-e66f281ed273\") " pod="openshift-marketplace/redhat-operators-hmzx7" Mar 16 00:15:56 crc kubenswrapper[4816]: I0316 00:15:56.352878 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6df1dc3a-6abd-4ffc-b27b-e66f281ed273-catalog-content\") pod \"redhat-operators-hmzx7\" (UID: \"6df1dc3a-6abd-4ffc-b27b-e66f281ed273\") " pod="openshift-marketplace/redhat-operators-hmzx7" Mar 16 00:15:56 crc kubenswrapper[4816]: I0316 00:15:56.353012 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6df1dc3a-6abd-4ffc-b27b-e66f281ed273-utilities\") pod \"redhat-operators-hmzx7\" (UID: \"6df1dc3a-6abd-4ffc-b27b-e66f281ed273\") " pod="openshift-marketplace/redhat-operators-hmzx7" Mar 16 00:15:56 crc kubenswrapper[4816]: I0316 00:15:56.372332 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2xct\" (UniqueName: 
\"kubernetes.io/projected/6df1dc3a-6abd-4ffc-b27b-e66f281ed273-kube-api-access-w2xct\") pod \"redhat-operators-hmzx7\" (UID: \"6df1dc3a-6abd-4ffc-b27b-e66f281ed273\") " pod="openshift-marketplace/redhat-operators-hmzx7" Mar 16 00:15:56 crc kubenswrapper[4816]: I0316 00:15:56.480667 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hmzx7" Mar 16 00:15:56 crc kubenswrapper[4816]: I0316 00:15:56.707241 4816 generic.go:334] "Generic (PLEG): container finished" podID="63fdf2e4-cd4d-40ad-ae9e-60642f5f47fa" containerID="4810564b12a91bcd6219665353971cf8fa3c739485d9b178416c0e173435b096" exitCode=0 Mar 16 00:15:56 crc kubenswrapper[4816]: I0316 00:15:56.707348 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nvgvc" event={"ID":"63fdf2e4-cd4d-40ad-ae9e-60642f5f47fa","Type":"ContainerDied","Data":"4810564b12a91bcd6219665353971cf8fa3c739485d9b178416c0e173435b096"} Mar 16 00:15:56 crc kubenswrapper[4816]: I0316 00:15:56.707402 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nvgvc" event={"ID":"63fdf2e4-cd4d-40ad-ae9e-60642f5f47fa","Type":"ContainerStarted","Data":"3956bdc0939ca6c80a18b82143c55a4cfebb9af362a0d61193b0fe36b4f051bd"} Mar 16 00:15:56 crc kubenswrapper[4816]: I0316 00:15:56.710624 4816 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 16 00:15:56 crc kubenswrapper[4816]: I0316 00:15:56.892605 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hmzx7"] Mar 16 00:15:57 crc kubenswrapper[4816]: I0316 00:15:57.714460 4816 generic.go:334] "Generic (PLEG): container finished" podID="6df1dc3a-6abd-4ffc-b27b-e66f281ed273" containerID="6514aa2b53586e6671a19991e43ae80c50682b23666589c71c32e64209a97e8f" exitCode=0 Mar 16 00:15:57 crc kubenswrapper[4816]: I0316 00:15:57.714810 4816 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hmzx7" event={"ID":"6df1dc3a-6abd-4ffc-b27b-e66f281ed273","Type":"ContainerDied","Data":"6514aa2b53586e6671a19991e43ae80c50682b23666589c71c32e64209a97e8f"} Mar 16 00:15:57 crc kubenswrapper[4816]: I0316 00:15:57.714836 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hmzx7" event={"ID":"6df1dc3a-6abd-4ffc-b27b-e66f281ed273","Type":"ContainerStarted","Data":"92f2128bfb20f3e54453e51680d3555a1725778a5428366b87af7ee5ed62f8a2"} Mar 16 00:15:57 crc kubenswrapper[4816]: I0316 00:15:57.951467 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-8jcgw"] Mar 16 00:15:57 crc kubenswrapper[4816]: I0316 00:15:57.953035 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8jcgw" Mar 16 00:15:57 crc kubenswrapper[4816]: I0316 00:15:57.959361 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 16 00:15:57 crc kubenswrapper[4816]: I0316 00:15:57.963121 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8jcgw"] Mar 16 00:15:57 crc kubenswrapper[4816]: I0316 00:15:57.992101 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/249ae30f-a698-43f3-9464-24868dff2ad6-utilities\") pod \"community-operators-8jcgw\" (UID: \"249ae30f-a698-43f3-9464-24868dff2ad6\") " pod="openshift-marketplace/community-operators-8jcgw" Mar 16 00:15:57 crc kubenswrapper[4816]: I0316 00:15:57.992159 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/249ae30f-a698-43f3-9464-24868dff2ad6-catalog-content\") pod \"community-operators-8jcgw\" (UID: 
\"249ae30f-a698-43f3-9464-24868dff2ad6\") " pod="openshift-marketplace/community-operators-8jcgw" Mar 16 00:15:57 crc kubenswrapper[4816]: I0316 00:15:57.992391 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zd6xx\" (UniqueName: \"kubernetes.io/projected/249ae30f-a698-43f3-9464-24868dff2ad6-kube-api-access-zd6xx\") pod \"community-operators-8jcgw\" (UID: \"249ae30f-a698-43f3-9464-24868dff2ad6\") " pod="openshift-marketplace/community-operators-8jcgw" Mar 16 00:15:58 crc kubenswrapper[4816]: I0316 00:15:58.093532 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zd6xx\" (UniqueName: \"kubernetes.io/projected/249ae30f-a698-43f3-9464-24868dff2ad6-kube-api-access-zd6xx\") pod \"community-operators-8jcgw\" (UID: \"249ae30f-a698-43f3-9464-24868dff2ad6\") " pod="openshift-marketplace/community-operators-8jcgw" Mar 16 00:15:58 crc kubenswrapper[4816]: I0316 00:15:58.093613 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/249ae30f-a698-43f3-9464-24868dff2ad6-utilities\") pod \"community-operators-8jcgw\" (UID: \"249ae30f-a698-43f3-9464-24868dff2ad6\") " pod="openshift-marketplace/community-operators-8jcgw" Mar 16 00:15:58 crc kubenswrapper[4816]: I0316 00:15:58.093637 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/249ae30f-a698-43f3-9464-24868dff2ad6-catalog-content\") pod \"community-operators-8jcgw\" (UID: \"249ae30f-a698-43f3-9464-24868dff2ad6\") " pod="openshift-marketplace/community-operators-8jcgw" Mar 16 00:15:58 crc kubenswrapper[4816]: I0316 00:15:58.094085 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/249ae30f-a698-43f3-9464-24868dff2ad6-catalog-content\") pod 
\"community-operators-8jcgw\" (UID: \"249ae30f-a698-43f3-9464-24868dff2ad6\") " pod="openshift-marketplace/community-operators-8jcgw" Mar 16 00:15:58 crc kubenswrapper[4816]: I0316 00:15:58.094489 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/249ae30f-a698-43f3-9464-24868dff2ad6-utilities\") pod \"community-operators-8jcgw\" (UID: \"249ae30f-a698-43f3-9464-24868dff2ad6\") " pod="openshift-marketplace/community-operators-8jcgw" Mar 16 00:15:58 crc kubenswrapper[4816]: I0316 00:15:58.117955 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zd6xx\" (UniqueName: \"kubernetes.io/projected/249ae30f-a698-43f3-9464-24868dff2ad6-kube-api-access-zd6xx\") pod \"community-operators-8jcgw\" (UID: \"249ae30f-a698-43f3-9464-24868dff2ad6\") " pod="openshift-marketplace/community-operators-8jcgw" Mar 16 00:15:58 crc kubenswrapper[4816]: I0316 00:15:58.287172 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8jcgw" Mar 16 00:15:58 crc kubenswrapper[4816]: I0316 00:15:58.547407 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-6z2gx"] Mar 16 00:15:58 crc kubenswrapper[4816]: I0316 00:15:58.548886 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6z2gx" Mar 16 00:15:58 crc kubenswrapper[4816]: I0316 00:15:58.556978 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 16 00:15:58 crc kubenswrapper[4816]: I0316 00:15:58.563352 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6z2gx"] Mar 16 00:15:58 crc kubenswrapper[4816]: I0316 00:15:58.600973 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8pfdf\" (UniqueName: \"kubernetes.io/projected/9d1b1f79-de52-4ade-9a72-69b86c55e8ff-kube-api-access-8pfdf\") pod \"certified-operators-6z2gx\" (UID: \"9d1b1f79-de52-4ade-9a72-69b86c55e8ff\") " pod="openshift-marketplace/certified-operators-6z2gx" Mar 16 00:15:58 crc kubenswrapper[4816]: I0316 00:15:58.601133 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d1b1f79-de52-4ade-9a72-69b86c55e8ff-utilities\") pod \"certified-operators-6z2gx\" (UID: \"9d1b1f79-de52-4ade-9a72-69b86c55e8ff\") " pod="openshift-marketplace/certified-operators-6z2gx" Mar 16 00:15:58 crc kubenswrapper[4816]: I0316 00:15:58.601213 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d1b1f79-de52-4ade-9a72-69b86c55e8ff-catalog-content\") pod \"certified-operators-6z2gx\" (UID: \"9d1b1f79-de52-4ade-9a72-69b86c55e8ff\") " pod="openshift-marketplace/certified-operators-6z2gx" Mar 16 00:15:58 crc kubenswrapper[4816]: I0316 00:15:58.700052 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8jcgw"] Mar 16 00:15:58 crc kubenswrapper[4816]: I0316 00:15:58.702300 4816 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d1b1f79-de52-4ade-9a72-69b86c55e8ff-utilities\") pod \"certified-operators-6z2gx\" (UID: \"9d1b1f79-de52-4ade-9a72-69b86c55e8ff\") " pod="openshift-marketplace/certified-operators-6z2gx" Mar 16 00:15:58 crc kubenswrapper[4816]: I0316 00:15:58.702398 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d1b1f79-de52-4ade-9a72-69b86c55e8ff-catalog-content\") pod \"certified-operators-6z2gx\" (UID: \"9d1b1f79-de52-4ade-9a72-69b86c55e8ff\") " pod="openshift-marketplace/certified-operators-6z2gx" Mar 16 00:15:58 crc kubenswrapper[4816]: I0316 00:15:58.702449 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8pfdf\" (UniqueName: \"kubernetes.io/projected/9d1b1f79-de52-4ade-9a72-69b86c55e8ff-kube-api-access-8pfdf\") pod \"certified-operators-6z2gx\" (UID: \"9d1b1f79-de52-4ade-9a72-69b86c55e8ff\") " pod="openshift-marketplace/certified-operators-6z2gx" Mar 16 00:15:58 crc kubenswrapper[4816]: I0316 00:15:58.702923 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d1b1f79-de52-4ade-9a72-69b86c55e8ff-utilities\") pod \"certified-operators-6z2gx\" (UID: \"9d1b1f79-de52-4ade-9a72-69b86c55e8ff\") " pod="openshift-marketplace/certified-operators-6z2gx" Mar 16 00:15:58 crc kubenswrapper[4816]: I0316 00:15:58.703018 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d1b1f79-de52-4ade-9a72-69b86c55e8ff-catalog-content\") pod \"certified-operators-6z2gx\" (UID: \"9d1b1f79-de52-4ade-9a72-69b86c55e8ff\") " pod="openshift-marketplace/certified-operators-6z2gx" Mar 16 00:15:58 crc kubenswrapper[4816]: W0316 00:15:58.707649 4816 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod249ae30f_a698_43f3_9464_24868dff2ad6.slice/crio-77c4301a4c4eb164930f0ce4e68ef45c27dbf7e455ddd8186c4d632ca28ba588 WatchSource:0}: Error finding container 77c4301a4c4eb164930f0ce4e68ef45c27dbf7e455ddd8186c4d632ca28ba588: Status 404 returned error can't find the container with id 77c4301a4c4eb164930f0ce4e68ef45c27dbf7e455ddd8186c4d632ca28ba588 Mar 16 00:15:58 crc kubenswrapper[4816]: I0316 00:15:58.722752 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8jcgw" event={"ID":"249ae30f-a698-43f3-9464-24868dff2ad6","Type":"ContainerStarted","Data":"77c4301a4c4eb164930f0ce4e68ef45c27dbf7e455ddd8186c4d632ca28ba588"} Mar 16 00:15:58 crc kubenswrapper[4816]: I0316 00:15:58.723388 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8pfdf\" (UniqueName: \"kubernetes.io/projected/9d1b1f79-de52-4ade-9a72-69b86c55e8ff-kube-api-access-8pfdf\") pod \"certified-operators-6z2gx\" (UID: \"9d1b1f79-de52-4ade-9a72-69b86c55e8ff\") " pod="openshift-marketplace/certified-operators-6z2gx" Mar 16 00:15:58 crc kubenswrapper[4816]: I0316 00:15:58.724906 4816 generic.go:334] "Generic (PLEG): container finished" podID="63fdf2e4-cd4d-40ad-ae9e-60642f5f47fa" containerID="6af5777618386b35653fedfb44583f6c085ee0d0796e2336a254ffa7ee599f64" exitCode=0 Mar 16 00:15:58 crc kubenswrapper[4816]: I0316 00:15:58.724939 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nvgvc" event={"ID":"63fdf2e4-cd4d-40ad-ae9e-60642f5f47fa","Type":"ContainerDied","Data":"6af5777618386b35653fedfb44583f6c085ee0d0796e2336a254ffa7ee599f64"} Mar 16 00:15:58 crc kubenswrapper[4816]: I0316 00:15:58.876072 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6z2gx" Mar 16 00:15:59 crc kubenswrapper[4816]: I0316 00:15:59.257034 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6z2gx"] Mar 16 00:15:59 crc kubenswrapper[4816]: W0316 00:15:59.259481 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d1b1f79_de52_4ade_9a72_69b86c55e8ff.slice/crio-d6634923f047727775df02d4d821820bfe16c08bbd5f740d3677d67d9b993223 WatchSource:0}: Error finding container d6634923f047727775df02d4d821820bfe16c08bbd5f740d3677d67d9b993223: Status 404 returned error can't find the container with id d6634923f047727775df02d4d821820bfe16c08bbd5f740d3677d67d9b993223 Mar 16 00:15:59 crc kubenswrapper[4816]: I0316 00:15:59.738340 4816 generic.go:334] "Generic (PLEG): container finished" podID="9d1b1f79-de52-4ade-9a72-69b86c55e8ff" containerID="5404e0e23149a051c63ece5c9f7e35a40c4b4a2c3bd255f946675c4984f1d005" exitCode=0 Mar 16 00:15:59 crc kubenswrapper[4816]: I0316 00:15:59.738401 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6z2gx" event={"ID":"9d1b1f79-de52-4ade-9a72-69b86c55e8ff","Type":"ContainerDied","Data":"5404e0e23149a051c63ece5c9f7e35a40c4b4a2c3bd255f946675c4984f1d005"} Mar 16 00:15:59 crc kubenswrapper[4816]: I0316 00:15:59.738440 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6z2gx" event={"ID":"9d1b1f79-de52-4ade-9a72-69b86c55e8ff","Type":"ContainerStarted","Data":"d6634923f047727775df02d4d821820bfe16c08bbd5f740d3677d67d9b993223"} Mar 16 00:15:59 crc kubenswrapper[4816]: I0316 00:15:59.742283 4816 generic.go:334] "Generic (PLEG): container finished" podID="249ae30f-a698-43f3-9464-24868dff2ad6" containerID="04a1a6b256be09e48052d9b9924ffda3f589a8015e9d1bf56580d8d275ea83bd" exitCode=0 Mar 16 00:15:59 crc kubenswrapper[4816]: I0316 
00:15:59.742325 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8jcgw" event={"ID":"249ae30f-a698-43f3-9464-24868dff2ad6","Type":"ContainerDied","Data":"04a1a6b256be09e48052d9b9924ffda3f589a8015e9d1bf56580d8d275ea83bd"} Mar 16 00:15:59 crc kubenswrapper[4816]: I0316 00:15:59.744713 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nvgvc" event={"ID":"63fdf2e4-cd4d-40ad-ae9e-60642f5f47fa","Type":"ContainerStarted","Data":"5794c299dc8a2535c1228578a35665f886997b739f9467f169232f1cb504746a"} Mar 16 00:15:59 crc kubenswrapper[4816]: I0316 00:15:59.748048 4816 generic.go:334] "Generic (PLEG): container finished" podID="6df1dc3a-6abd-4ffc-b27b-e66f281ed273" containerID="c323718467b7a05f9e466cb8c30f184578db98b430cc00f4184970fe4c9c9980" exitCode=0 Mar 16 00:15:59 crc kubenswrapper[4816]: I0316 00:15:59.748070 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hmzx7" event={"ID":"6df1dc3a-6abd-4ffc-b27b-e66f281ed273","Type":"ContainerDied","Data":"c323718467b7a05f9e466cb8c30f184578db98b430cc00f4184970fe4c9c9980"} Mar 16 00:16:00 crc kubenswrapper[4816]: I0316 00:16:00.137849 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-nvgvc" podStartSLOduration=2.5773334180000003 podStartE2EDuration="5.137829848s" podCreationTimestamp="2026-03-16 00:15:55 +0000 UTC" firstStartedPulling="2026-03-16 00:15:56.708665323 +0000 UTC m=+549.804965286" lastFinishedPulling="2026-03-16 00:15:59.269161763 +0000 UTC m=+552.365461716" observedRunningTime="2026-03-16 00:15:59.813235378 +0000 UTC m=+552.909535341" watchObservedRunningTime="2026-03-16 00:16:00.137829848 +0000 UTC m=+553.234129811" Mar 16 00:16:00 crc kubenswrapper[4816]: I0316 00:16:00.141714 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29560336-fncq8"] Mar 16 00:16:00 crc 
kubenswrapper[4816]: I0316 00:16:00.142510 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560336-fncq8" Mar 16 00:16:00 crc kubenswrapper[4816]: I0316 00:16:00.144671 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 16 00:16:00 crc kubenswrapper[4816]: I0316 00:16:00.144985 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 16 00:16:00 crc kubenswrapper[4816]: I0316 00:16:00.145857 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-8hc2r" Mar 16 00:16:00 crc kubenswrapper[4816]: I0316 00:16:00.157501 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29560336-fncq8"] Mar 16 00:16:00 crc kubenswrapper[4816]: I0316 00:16:00.221730 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lcq2v\" (UniqueName: \"kubernetes.io/projected/b478a542-14c7-4cca-9f95-64766b34df27-kube-api-access-lcq2v\") pod \"auto-csr-approver-29560336-fncq8\" (UID: \"b478a542-14c7-4cca-9f95-64766b34df27\") " pod="openshift-infra/auto-csr-approver-29560336-fncq8" Mar 16 00:16:00 crc kubenswrapper[4816]: I0316 00:16:00.322592 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lcq2v\" (UniqueName: \"kubernetes.io/projected/b478a542-14c7-4cca-9f95-64766b34df27-kube-api-access-lcq2v\") pod \"auto-csr-approver-29560336-fncq8\" (UID: \"b478a542-14c7-4cca-9f95-64766b34df27\") " pod="openshift-infra/auto-csr-approver-29560336-fncq8" Mar 16 00:16:00 crc kubenswrapper[4816]: I0316 00:16:00.346126 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lcq2v\" (UniqueName: \"kubernetes.io/projected/b478a542-14c7-4cca-9f95-64766b34df27-kube-api-access-lcq2v\") pod 
\"auto-csr-approver-29560336-fncq8\" (UID: \"b478a542-14c7-4cca-9f95-64766b34df27\") " pod="openshift-infra/auto-csr-approver-29560336-fncq8" Mar 16 00:16:00 crc kubenswrapper[4816]: I0316 00:16:00.457929 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560336-fncq8" Mar 16 00:16:00 crc kubenswrapper[4816]: I0316 00:16:00.661013 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29560336-fncq8"] Mar 16 00:16:00 crc kubenswrapper[4816]: I0316 00:16:00.754466 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560336-fncq8" event={"ID":"b478a542-14c7-4cca-9f95-64766b34df27","Type":"ContainerStarted","Data":"f07774f669536f75976daaac2514712b31c1e4c589e53aab8a4efeae7ad978ba"} Mar 16 00:16:00 crc kubenswrapper[4816]: I0316 00:16:00.756269 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hmzx7" event={"ID":"6df1dc3a-6abd-4ffc-b27b-e66f281ed273","Type":"ContainerStarted","Data":"23f016db7cf617088f08092141274ee3b1304f41b7b54130bb94c977188f621d"} Mar 16 00:16:00 crc kubenswrapper[4816]: I0316 00:16:00.779565 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-hmzx7" podStartSLOduration=2.112517182 podStartE2EDuration="4.779512069s" podCreationTimestamp="2026-03-16 00:15:56 +0000 UTC" firstStartedPulling="2026-03-16 00:15:57.72125218 +0000 UTC m=+550.817552133" lastFinishedPulling="2026-03-16 00:16:00.388247067 +0000 UTC m=+553.484547020" observedRunningTime="2026-03-16 00:16:00.774463121 +0000 UTC m=+553.870763094" watchObservedRunningTime="2026-03-16 00:16:00.779512069 +0000 UTC m=+553.875812022" Mar 16 00:16:01 crc kubenswrapper[4816]: I0316 00:16:01.764051 4816 generic.go:334] "Generic (PLEG): container finished" podID="9d1b1f79-de52-4ade-9a72-69b86c55e8ff" 
containerID="083bb242e6bde224d00d304d1f2d497cdabe3bfd93e6548b23a50066c9a1dbce" exitCode=0 Mar 16 00:16:01 crc kubenswrapper[4816]: I0316 00:16:01.764177 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6z2gx" event={"ID":"9d1b1f79-de52-4ade-9a72-69b86c55e8ff","Type":"ContainerDied","Data":"083bb242e6bde224d00d304d1f2d497cdabe3bfd93e6548b23a50066c9a1dbce"} Mar 16 00:16:01 crc kubenswrapper[4816]: I0316 00:16:01.775277 4816 generic.go:334] "Generic (PLEG): container finished" podID="249ae30f-a698-43f3-9464-24868dff2ad6" containerID="57ff0505ecaf816a1504b57a15468a94a9a38d0dae1d76628212d7c0e0c8e261" exitCode=0 Mar 16 00:16:01 crc kubenswrapper[4816]: I0316 00:16:01.775331 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8jcgw" event={"ID":"249ae30f-a698-43f3-9464-24868dff2ad6","Type":"ContainerDied","Data":"57ff0505ecaf816a1504b57a15468a94a9a38d0dae1d76628212d7c0e0c8e261"} Mar 16 00:16:02 crc kubenswrapper[4816]: I0316 00:16:02.783286 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6z2gx" event={"ID":"9d1b1f79-de52-4ade-9a72-69b86c55e8ff","Type":"ContainerStarted","Data":"700492b2ebbd124fe994672d95f3a4f2989857c4185a74f7530dda83e9156f15"} Mar 16 00:16:02 crc kubenswrapper[4816]: I0316 00:16:02.786852 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8jcgw" event={"ID":"249ae30f-a698-43f3-9464-24868dff2ad6","Type":"ContainerStarted","Data":"29a2a6234324111f47ddb9745f9b567e930dbd3b8bac5490a80925c5a3ec4d8d"} Mar 16 00:16:02 crc kubenswrapper[4816]: I0316 00:16:02.788885 4816 generic.go:334] "Generic (PLEG): container finished" podID="b478a542-14c7-4cca-9f95-64766b34df27" containerID="185e1a33c845773d7893f16759f110b3a4a2b357c62cdafa5e5060cabc62a64e" exitCode=0 Mar 16 00:16:02 crc kubenswrapper[4816]: I0316 00:16:02.788930 4816 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-infra/auto-csr-approver-29560336-fncq8" event={"ID":"b478a542-14c7-4cca-9f95-64766b34df27","Type":"ContainerDied","Data":"185e1a33c845773d7893f16759f110b3a4a2b357c62cdafa5e5060cabc62a64e"} Mar 16 00:16:02 crc kubenswrapper[4816]: I0316 00:16:02.804639 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-6z2gx" podStartSLOduration=2.105553273 podStartE2EDuration="4.804622079s" podCreationTimestamp="2026-03-16 00:15:58 +0000 UTC" firstStartedPulling="2026-03-16 00:15:59.740560491 +0000 UTC m=+552.836860444" lastFinishedPulling="2026-03-16 00:16:02.439629297 +0000 UTC m=+555.535929250" observedRunningTime="2026-03-16 00:16:02.801489017 +0000 UTC m=+555.897788970" watchObservedRunningTime="2026-03-16 00:16:02.804622079 +0000 UTC m=+555.900922032" Mar 16 00:16:02 crc kubenswrapper[4816]: I0316 00:16:02.821354 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-8jcgw" podStartSLOduration=3.01836504 podStartE2EDuration="5.821334138s" podCreationTimestamp="2026-03-16 00:15:57 +0000 UTC" firstStartedPulling="2026-03-16 00:15:59.752770298 +0000 UTC m=+552.849070251" lastFinishedPulling="2026-03-16 00:16:02.555739396 +0000 UTC m=+555.652039349" observedRunningTime="2026-03-16 00:16:02.820564626 +0000 UTC m=+555.916864589" watchObservedRunningTime="2026-03-16 00:16:02.821334138 +0000 UTC m=+555.917634101" Mar 16 00:16:04 crc kubenswrapper[4816]: I0316 00:16:04.070567 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29560336-fncq8" Mar 16 00:16:04 crc kubenswrapper[4816]: I0316 00:16:04.178847 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lcq2v\" (UniqueName: \"kubernetes.io/projected/b478a542-14c7-4cca-9f95-64766b34df27-kube-api-access-lcq2v\") pod \"b478a542-14c7-4cca-9f95-64766b34df27\" (UID: \"b478a542-14c7-4cca-9f95-64766b34df27\") " Mar 16 00:16:04 crc kubenswrapper[4816]: I0316 00:16:04.184779 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b478a542-14c7-4cca-9f95-64766b34df27-kube-api-access-lcq2v" (OuterVolumeSpecName: "kube-api-access-lcq2v") pod "b478a542-14c7-4cca-9f95-64766b34df27" (UID: "b478a542-14c7-4cca-9f95-64766b34df27"). InnerVolumeSpecName "kube-api-access-lcq2v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:16:04 crc kubenswrapper[4816]: I0316 00:16:04.280749 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lcq2v\" (UniqueName: \"kubernetes.io/projected/b478a542-14c7-4cca-9f95-64766b34df27-kube-api-access-lcq2v\") on node \"crc\" DevicePath \"\"" Mar 16 00:16:04 crc kubenswrapper[4816]: I0316 00:16:04.802424 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560336-fncq8" event={"ID":"b478a542-14c7-4cca-9f95-64766b34df27","Type":"ContainerDied","Data":"f07774f669536f75976daaac2514712b31c1e4c589e53aab8a4efeae7ad978ba"} Mar 16 00:16:04 crc kubenswrapper[4816]: I0316 00:16:04.802469 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f07774f669536f75976daaac2514712b31c1e4c589e53aab8a4efeae7ad978ba" Mar 16 00:16:04 crc kubenswrapper[4816]: I0316 00:16:04.802473 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29560336-fncq8" Mar 16 00:16:05 crc kubenswrapper[4816]: I0316 00:16:05.124212 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29560330-44pts"] Mar 16 00:16:05 crc kubenswrapper[4816]: I0316 00:16:05.127926 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29560330-44pts"] Mar 16 00:16:05 crc kubenswrapper[4816]: I0316 00:16:05.675606 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55e76e8f-7d69-4f55-81f8-45c9c612876b" path="/var/lib/kubelet/pods/55e76e8f-7d69-4f55-81f8-45c9c612876b/volumes" Mar 16 00:16:05 crc kubenswrapper[4816]: I0316 00:16:05.865526 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-nvgvc" Mar 16 00:16:05 crc kubenswrapper[4816]: I0316 00:16:05.865581 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-nvgvc" Mar 16 00:16:05 crc kubenswrapper[4816]: I0316 00:16:05.908397 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-nvgvc" Mar 16 00:16:06 crc kubenswrapper[4816]: I0316 00:16:06.481868 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-hmzx7" Mar 16 00:16:06 crc kubenswrapper[4816]: I0316 00:16:06.482254 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-hmzx7" Mar 16 00:16:06 crc kubenswrapper[4816]: I0316 00:16:06.867279 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-nvgvc" Mar 16 00:16:07 crc kubenswrapper[4816]: I0316 00:16:07.523051 4816 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-hmzx7" 
podUID="6df1dc3a-6abd-4ffc-b27b-e66f281ed273" containerName="registry-server" probeResult="failure" output=< Mar 16 00:16:07 crc kubenswrapper[4816]: timeout: failed to connect service ":50051" within 1s Mar 16 00:16:07 crc kubenswrapper[4816]: > Mar 16 00:16:08 crc kubenswrapper[4816]: I0316 00:16:08.288177 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-8jcgw" Mar 16 00:16:08 crc kubenswrapper[4816]: I0316 00:16:08.288232 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-8jcgw" Mar 16 00:16:08 crc kubenswrapper[4816]: I0316 00:16:08.324220 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-8jcgw" Mar 16 00:16:08 crc kubenswrapper[4816]: I0316 00:16:08.876676 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-6z2gx" Mar 16 00:16:08 crc kubenswrapper[4816]: I0316 00:16:08.876751 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-6z2gx" Mar 16 00:16:08 crc kubenswrapper[4816]: I0316 00:16:08.899049 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-8jcgw" Mar 16 00:16:08 crc kubenswrapper[4816]: I0316 00:16:08.937193 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-6z2gx" Mar 16 00:16:09 crc kubenswrapper[4816]: I0316 00:16:09.904149 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-6z2gx" Mar 16 00:16:16 crc kubenswrapper[4816]: I0316 00:16:16.540361 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-hmzx7" Mar 16 00:16:16 crc kubenswrapper[4816]: 
I0316 00:16:16.582613 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-hmzx7" Mar 16 00:17:03 crc kubenswrapper[4816]: I0316 00:17:03.255048 4816 scope.go:117] "RemoveContainer" containerID="a0546877ac51e8fef907f2152b03530a1aaadfb1ec0bb2cad119c19beb5651ba" Mar 16 00:17:03 crc kubenswrapper[4816]: I0316 00:17:03.302580 4816 scope.go:117] "RemoveContainer" containerID="5259cd97d29c896bcf8ba7141fe44641e990295b28288f54dfe4315de536ad23" Mar 16 00:18:00 crc kubenswrapper[4816]: I0316 00:18:00.139181 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29560338-8bkf9"] Mar 16 00:18:00 crc kubenswrapper[4816]: E0316 00:18:00.140026 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b478a542-14c7-4cca-9f95-64766b34df27" containerName="oc" Mar 16 00:18:00 crc kubenswrapper[4816]: I0316 00:18:00.140044 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="b478a542-14c7-4cca-9f95-64766b34df27" containerName="oc" Mar 16 00:18:00 crc kubenswrapper[4816]: I0316 00:18:00.140196 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="b478a542-14c7-4cca-9f95-64766b34df27" containerName="oc" Mar 16 00:18:00 crc kubenswrapper[4816]: I0316 00:18:00.141047 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29560338-8bkf9" Mar 16 00:18:00 crc kubenswrapper[4816]: I0316 00:18:00.145977 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 16 00:18:00 crc kubenswrapper[4816]: I0316 00:18:00.145977 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-8hc2r" Mar 16 00:18:00 crc kubenswrapper[4816]: I0316 00:18:00.146053 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 16 00:18:00 crc kubenswrapper[4816]: I0316 00:18:00.146475 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29560338-8bkf9"] Mar 16 00:18:00 crc kubenswrapper[4816]: I0316 00:18:00.261056 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2gzk\" (UniqueName: \"kubernetes.io/projected/6cfda38e-dbdc-4b42-8a0d-964103ee01cd-kube-api-access-p2gzk\") pod \"auto-csr-approver-29560338-8bkf9\" (UID: \"6cfda38e-dbdc-4b42-8a0d-964103ee01cd\") " pod="openshift-infra/auto-csr-approver-29560338-8bkf9" Mar 16 00:18:00 crc kubenswrapper[4816]: I0316 00:18:00.361826 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p2gzk\" (UniqueName: \"kubernetes.io/projected/6cfda38e-dbdc-4b42-8a0d-964103ee01cd-kube-api-access-p2gzk\") pod \"auto-csr-approver-29560338-8bkf9\" (UID: \"6cfda38e-dbdc-4b42-8a0d-964103ee01cd\") " pod="openshift-infra/auto-csr-approver-29560338-8bkf9" Mar 16 00:18:00 crc kubenswrapper[4816]: I0316 00:18:00.396093 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2gzk\" (UniqueName: \"kubernetes.io/projected/6cfda38e-dbdc-4b42-8a0d-964103ee01cd-kube-api-access-p2gzk\") pod \"auto-csr-approver-29560338-8bkf9\" (UID: \"6cfda38e-dbdc-4b42-8a0d-964103ee01cd\") " 
pod="openshift-infra/auto-csr-approver-29560338-8bkf9" Mar 16 00:18:00 crc kubenswrapper[4816]: I0316 00:18:00.512093 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560338-8bkf9" Mar 16 00:18:00 crc kubenswrapper[4816]: I0316 00:18:00.729331 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29560338-8bkf9"] Mar 16 00:18:01 crc kubenswrapper[4816]: I0316 00:18:01.557295 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560338-8bkf9" event={"ID":"6cfda38e-dbdc-4b42-8a0d-964103ee01cd","Type":"ContainerStarted","Data":"c3e79faa65c4ca4a97231baa5de42757fa1bf5ee8bd498027cd3d986320c200c"} Mar 16 00:18:01 crc kubenswrapper[4816]: I0316 00:18:01.863614 4816 patch_prober.go:28] interesting pod/machine-config-daemon-jrdcz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 16 00:18:01 crc kubenswrapper[4816]: I0316 00:18:01.863686 4816 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" podUID="dd08ece2-7636-4966-973a-e96a34b70b53" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 16 00:18:02 crc kubenswrapper[4816]: I0316 00:18:02.566385 4816 generic.go:334] "Generic (PLEG): container finished" podID="6cfda38e-dbdc-4b42-8a0d-964103ee01cd" containerID="b862cec0bd3d63e5c9dfe4071f9f4f3cb758b083bc3f73a5460bc03b5c4debd8" exitCode=0 Mar 16 00:18:02 crc kubenswrapper[4816]: I0316 00:18:02.566468 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560338-8bkf9" 
event={"ID":"6cfda38e-dbdc-4b42-8a0d-964103ee01cd","Type":"ContainerDied","Data":"b862cec0bd3d63e5c9dfe4071f9f4f3cb758b083bc3f73a5460bc03b5c4debd8"} Mar 16 00:18:03 crc kubenswrapper[4816]: I0316 00:18:03.763532 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560338-8bkf9" Mar 16 00:18:03 crc kubenswrapper[4816]: I0316 00:18:03.909660 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p2gzk\" (UniqueName: \"kubernetes.io/projected/6cfda38e-dbdc-4b42-8a0d-964103ee01cd-kube-api-access-p2gzk\") pod \"6cfda38e-dbdc-4b42-8a0d-964103ee01cd\" (UID: \"6cfda38e-dbdc-4b42-8a0d-964103ee01cd\") " Mar 16 00:18:03 crc kubenswrapper[4816]: I0316 00:18:03.916229 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6cfda38e-dbdc-4b42-8a0d-964103ee01cd-kube-api-access-p2gzk" (OuterVolumeSpecName: "kube-api-access-p2gzk") pod "6cfda38e-dbdc-4b42-8a0d-964103ee01cd" (UID: "6cfda38e-dbdc-4b42-8a0d-964103ee01cd"). InnerVolumeSpecName "kube-api-access-p2gzk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:18:04 crc kubenswrapper[4816]: I0316 00:18:04.012191 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p2gzk\" (UniqueName: \"kubernetes.io/projected/6cfda38e-dbdc-4b42-8a0d-964103ee01cd-kube-api-access-p2gzk\") on node \"crc\" DevicePath \"\"" Mar 16 00:18:04 crc kubenswrapper[4816]: I0316 00:18:04.578200 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560338-8bkf9" event={"ID":"6cfda38e-dbdc-4b42-8a0d-964103ee01cd","Type":"ContainerDied","Data":"c3e79faa65c4ca4a97231baa5de42757fa1bf5ee8bd498027cd3d986320c200c"} Mar 16 00:18:04 crc kubenswrapper[4816]: I0316 00:18:04.578495 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c3e79faa65c4ca4a97231baa5de42757fa1bf5ee8bd498027cd3d986320c200c" Mar 16 00:18:04 crc kubenswrapper[4816]: I0316 00:18:04.578263 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560338-8bkf9" Mar 16 00:18:04 crc kubenswrapper[4816]: I0316 00:18:04.818892 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29560332-wb8kg"] Mar 16 00:18:04 crc kubenswrapper[4816]: I0316 00:18:04.828991 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29560332-wb8kg"] Mar 16 00:18:05 crc kubenswrapper[4816]: I0316 00:18:05.676914 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e570fb38-3e4c-4b9b-82d9-878ec6a5306f" path="/var/lib/kubelet/pods/e570fb38-3e4c-4b9b-82d9-878ec6a5306f/volumes" Mar 16 00:18:31 crc kubenswrapper[4816]: I0316 00:18:31.863750 4816 patch_prober.go:28] interesting pod/machine-config-daemon-jrdcz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 16 00:18:31 crc kubenswrapper[4816]: I0316 00:18:31.864213 4816 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" podUID="dd08ece2-7636-4966-973a-e96a34b70b53" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 16 00:19:01 crc kubenswrapper[4816]: I0316 00:19:01.863667 4816 patch_prober.go:28] interesting pod/machine-config-daemon-jrdcz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 16 00:19:01 crc kubenswrapper[4816]: I0316 00:19:01.864366 4816 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" podUID="dd08ece2-7636-4966-973a-e96a34b70b53" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 16 00:19:01 crc kubenswrapper[4816]: I0316 00:19:01.864431 4816 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" Mar 16 00:19:01 crc kubenswrapper[4816]: I0316 00:19:01.865427 4816 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"054dcd9294a0533063364a3ea7e009e513fea0236f1afad37201a02a85a0eee4"} pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 16 00:19:01 crc kubenswrapper[4816]: I0316 00:19:01.865515 4816 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" podUID="dd08ece2-7636-4966-973a-e96a34b70b53" containerName="machine-config-daemon" containerID="cri-o://054dcd9294a0533063364a3ea7e009e513fea0236f1afad37201a02a85a0eee4" gracePeriod=600 Mar 16 00:19:02 crc kubenswrapper[4816]: I0316 00:19:02.947621 4816 generic.go:334] "Generic (PLEG): container finished" podID="dd08ece2-7636-4966-973a-e96a34b70b53" containerID="054dcd9294a0533063364a3ea7e009e513fea0236f1afad37201a02a85a0eee4" exitCode=0 Mar 16 00:19:02 crc kubenswrapper[4816]: I0316 00:19:02.947696 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" event={"ID":"dd08ece2-7636-4966-973a-e96a34b70b53","Type":"ContainerDied","Data":"054dcd9294a0533063364a3ea7e009e513fea0236f1afad37201a02a85a0eee4"} Mar 16 00:19:02 crc kubenswrapper[4816]: I0316 00:19:02.947749 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" event={"ID":"dd08ece2-7636-4966-973a-e96a34b70b53","Type":"ContainerStarted","Data":"d940a23c182654ea98c304045d406af01d62b828901045324158f53e5e4988ad"} Mar 16 00:19:02 crc kubenswrapper[4816]: I0316 00:19:02.947781 4816 scope.go:117] "RemoveContainer" containerID="8214b8a7550606e587b215ee7c72e3638e054dd083cb6fa7b37990d33bec509b" Mar 16 00:19:03 crc kubenswrapper[4816]: I0316 00:19:03.361666 4816 scope.go:117] "RemoveContainer" containerID="9cbc70d2e0b275d40fbacb6be14712c60796f46bdd73e4f108a004a37c120cb9" Mar 16 00:19:03 crc kubenswrapper[4816]: I0316 00:19:03.375362 4816 scope.go:117] "RemoveContainer" containerID="92ce11f74b2381302bcae2babd96b3eab76e1d28bfb034c70d8b99be8178dac1" Mar 16 00:19:03 crc kubenswrapper[4816]: I0316 00:19:03.393839 4816 scope.go:117] "RemoveContainer" containerID="c422afc027f6d729cf317777cce7cb5de5ed92334512743c933f67e04e4724ef" Mar 16 00:20:00 crc kubenswrapper[4816]: I0316 00:20:00.143562 4816 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-infra/auto-csr-approver-29560340-pmlmw"] Mar 16 00:20:00 crc kubenswrapper[4816]: E0316 00:20:00.145944 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6cfda38e-dbdc-4b42-8a0d-964103ee01cd" containerName="oc" Mar 16 00:20:00 crc kubenswrapper[4816]: I0316 00:20:00.146037 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cfda38e-dbdc-4b42-8a0d-964103ee01cd" containerName="oc" Mar 16 00:20:00 crc kubenswrapper[4816]: I0316 00:20:00.146224 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="6cfda38e-dbdc-4b42-8a0d-964103ee01cd" containerName="oc" Mar 16 00:20:00 crc kubenswrapper[4816]: I0316 00:20:00.146825 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560340-pmlmw" Mar 16 00:20:00 crc kubenswrapper[4816]: I0316 00:20:00.150487 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 16 00:20:00 crc kubenswrapper[4816]: I0316 00:20:00.150522 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-8hc2r" Mar 16 00:20:00 crc kubenswrapper[4816]: I0316 00:20:00.152030 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 16 00:20:00 crc kubenswrapper[4816]: I0316 00:20:00.163685 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29560340-pmlmw"] Mar 16 00:20:00 crc kubenswrapper[4816]: I0316 00:20:00.172644 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26lqg\" (UniqueName: \"kubernetes.io/projected/dc958138-2767-4d7a-8f61-bd16b899189f-kube-api-access-26lqg\") pod \"auto-csr-approver-29560340-pmlmw\" (UID: \"dc958138-2767-4d7a-8f61-bd16b899189f\") " pod="openshift-infra/auto-csr-approver-29560340-pmlmw" Mar 16 00:20:00 crc 
kubenswrapper[4816]: I0316 00:20:00.273462 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26lqg\" (UniqueName: \"kubernetes.io/projected/dc958138-2767-4d7a-8f61-bd16b899189f-kube-api-access-26lqg\") pod \"auto-csr-approver-29560340-pmlmw\" (UID: \"dc958138-2767-4d7a-8f61-bd16b899189f\") " pod="openshift-infra/auto-csr-approver-29560340-pmlmw" Mar 16 00:20:00 crc kubenswrapper[4816]: I0316 00:20:00.300744 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26lqg\" (UniqueName: \"kubernetes.io/projected/dc958138-2767-4d7a-8f61-bd16b899189f-kube-api-access-26lqg\") pod \"auto-csr-approver-29560340-pmlmw\" (UID: \"dc958138-2767-4d7a-8f61-bd16b899189f\") " pod="openshift-infra/auto-csr-approver-29560340-pmlmw" Mar 16 00:20:00 crc kubenswrapper[4816]: I0316 00:20:00.465857 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560340-pmlmw" Mar 16 00:20:00 crc kubenswrapper[4816]: I0316 00:20:00.685441 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29560340-pmlmw"] Mar 16 00:20:01 crc kubenswrapper[4816]: I0316 00:20:01.316926 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560340-pmlmw" event={"ID":"dc958138-2767-4d7a-8f61-bd16b899189f","Type":"ContainerStarted","Data":"f4329218a78829e970d0cd09947abdb25a1eb256ae427623608fcb446c86f8f3"} Mar 16 00:20:02 crc kubenswrapper[4816]: I0316 00:20:02.323166 4816 generic.go:334] "Generic (PLEG): container finished" podID="dc958138-2767-4d7a-8f61-bd16b899189f" containerID="4565949d11f1fa384d67b3420395f0c07c9d2ee22190f1a94b2e1bc9e4c10a96" exitCode=0 Mar 16 00:20:02 crc kubenswrapper[4816]: I0316 00:20:02.323387 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560340-pmlmw" 
event={"ID":"dc958138-2767-4d7a-8f61-bd16b899189f","Type":"ContainerDied","Data":"4565949d11f1fa384d67b3420395f0c07c9d2ee22190f1a94b2e1bc9e4c10a96"} Mar 16 00:20:03 crc kubenswrapper[4816]: I0316 00:20:03.547323 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560340-pmlmw" Mar 16 00:20:03 crc kubenswrapper[4816]: I0316 00:20:03.731836 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-26lqg\" (UniqueName: \"kubernetes.io/projected/dc958138-2767-4d7a-8f61-bd16b899189f-kube-api-access-26lqg\") pod \"dc958138-2767-4d7a-8f61-bd16b899189f\" (UID: \"dc958138-2767-4d7a-8f61-bd16b899189f\") " Mar 16 00:20:03 crc kubenswrapper[4816]: I0316 00:20:03.739736 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc958138-2767-4d7a-8f61-bd16b899189f-kube-api-access-26lqg" (OuterVolumeSpecName: "kube-api-access-26lqg") pod "dc958138-2767-4d7a-8f61-bd16b899189f" (UID: "dc958138-2767-4d7a-8f61-bd16b899189f"). InnerVolumeSpecName "kube-api-access-26lqg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:20:03 crc kubenswrapper[4816]: I0316 00:20:03.833153 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-26lqg\" (UniqueName: \"kubernetes.io/projected/dc958138-2767-4d7a-8f61-bd16b899189f-kube-api-access-26lqg\") on node \"crc\" DevicePath \"\"" Mar 16 00:20:04 crc kubenswrapper[4816]: I0316 00:20:04.337450 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560340-pmlmw" event={"ID":"dc958138-2767-4d7a-8f61-bd16b899189f","Type":"ContainerDied","Data":"f4329218a78829e970d0cd09947abdb25a1eb256ae427623608fcb446c86f8f3"} Mar 16 00:20:04 crc kubenswrapper[4816]: I0316 00:20:04.337913 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f4329218a78829e970d0cd09947abdb25a1eb256ae427623608fcb446c86f8f3" Mar 16 00:20:04 crc kubenswrapper[4816]: I0316 00:20:04.337497 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560340-pmlmw" Mar 16 00:20:04 crc kubenswrapper[4816]: I0316 00:20:04.607810 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29560334-7sx8j"] Mar 16 00:20:04 crc kubenswrapper[4816]: I0316 00:20:04.611201 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29560334-7sx8j"] Mar 16 00:20:05 crc kubenswrapper[4816]: I0316 00:20:05.679795 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5160d394-3d9b-4066-9bea-b9dd787b2a42" path="/var/lib/kubelet/pods/5160d394-3d9b-4066-9bea-b9dd787b2a42/volumes" Mar 16 00:20:11 crc kubenswrapper[4816]: I0316 00:20:11.784301 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-lslqp"] Mar 16 00:20:11 crc kubenswrapper[4816]: E0316 00:20:11.785808 4816 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="dc958138-2767-4d7a-8f61-bd16b899189f" containerName="oc" Mar 16 00:20:11 crc kubenswrapper[4816]: I0316 00:20:11.785911 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc958138-2767-4d7a-8f61-bd16b899189f" containerName="oc" Mar 16 00:20:11 crc kubenswrapper[4816]: I0316 00:20:11.786109 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc958138-2767-4d7a-8f61-bd16b899189f" containerName="oc" Mar 16 00:20:11 crc kubenswrapper[4816]: I0316 00:20:11.786678 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-lslqp" Mar 16 00:20:11 crc kubenswrapper[4816]: I0316 00:20:11.801523 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-lslqp"] Mar 16 00:20:11 crc kubenswrapper[4816]: I0316 00:20:11.968811 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c7a56fa5-2504-4cbc-87c9-769b6c88b362-installation-pull-secrets\") pod \"image-registry-66df7c8f76-lslqp\" (UID: \"c7a56fa5-2504-4cbc-87c9-769b6c88b362\") " pod="openshift-image-registry/image-registry-66df7c8f76-lslqp" Mar 16 00:20:11 crc kubenswrapper[4816]: I0316 00:20:11.968898 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c7a56fa5-2504-4cbc-87c9-769b6c88b362-bound-sa-token\") pod \"image-registry-66df7c8f76-lslqp\" (UID: \"c7a56fa5-2504-4cbc-87c9-769b6c88b362\") " pod="openshift-image-registry/image-registry-66df7c8f76-lslqp" Mar 16 00:20:11 crc kubenswrapper[4816]: I0316 00:20:11.968922 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c7a56fa5-2504-4cbc-87c9-769b6c88b362-registry-certificates\") pod 
\"image-registry-66df7c8f76-lslqp\" (UID: \"c7a56fa5-2504-4cbc-87c9-769b6c88b362\") " pod="openshift-image-registry/image-registry-66df7c8f76-lslqp" Mar 16 00:20:11 crc kubenswrapper[4816]: I0316 00:20:11.968943 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c7a56fa5-2504-4cbc-87c9-769b6c88b362-trusted-ca\") pod \"image-registry-66df7c8f76-lslqp\" (UID: \"c7a56fa5-2504-4cbc-87c9-769b6c88b362\") " pod="openshift-image-registry/image-registry-66df7c8f76-lslqp" Mar 16 00:20:11 crc kubenswrapper[4816]: I0316 00:20:11.968989 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-lslqp\" (UID: \"c7a56fa5-2504-4cbc-87c9-769b6c88b362\") " pod="openshift-image-registry/image-registry-66df7c8f76-lslqp" Mar 16 00:20:11 crc kubenswrapper[4816]: I0316 00:20:11.969016 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c7a56fa5-2504-4cbc-87c9-769b6c88b362-ca-trust-extracted\") pod \"image-registry-66df7c8f76-lslqp\" (UID: \"c7a56fa5-2504-4cbc-87c9-769b6c88b362\") " pod="openshift-image-registry/image-registry-66df7c8f76-lslqp" Mar 16 00:20:11 crc kubenswrapper[4816]: I0316 00:20:11.969041 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c7a56fa5-2504-4cbc-87c9-769b6c88b362-registry-tls\") pod \"image-registry-66df7c8f76-lslqp\" (UID: \"c7a56fa5-2504-4cbc-87c9-769b6c88b362\") " pod="openshift-image-registry/image-registry-66df7c8f76-lslqp" Mar 16 00:20:11 crc kubenswrapper[4816]: I0316 00:20:11.969062 4816 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jx29s\" (UniqueName: \"kubernetes.io/projected/c7a56fa5-2504-4cbc-87c9-769b6c88b362-kube-api-access-jx29s\") pod \"image-registry-66df7c8f76-lslqp\" (UID: \"c7a56fa5-2504-4cbc-87c9-769b6c88b362\") " pod="openshift-image-registry/image-registry-66df7c8f76-lslqp" Mar 16 00:20:11 crc kubenswrapper[4816]: I0316 00:20:11.992827 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-lslqp\" (UID: \"c7a56fa5-2504-4cbc-87c9-769b6c88b362\") " pod="openshift-image-registry/image-registry-66df7c8f76-lslqp" Mar 16 00:20:12 crc kubenswrapper[4816]: I0316 00:20:12.070318 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c7a56fa5-2504-4cbc-87c9-769b6c88b362-installation-pull-secrets\") pod \"image-registry-66df7c8f76-lslqp\" (UID: \"c7a56fa5-2504-4cbc-87c9-769b6c88b362\") " pod="openshift-image-registry/image-registry-66df7c8f76-lslqp" Mar 16 00:20:12 crc kubenswrapper[4816]: I0316 00:20:12.070793 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c7a56fa5-2504-4cbc-87c9-769b6c88b362-bound-sa-token\") pod \"image-registry-66df7c8f76-lslqp\" (UID: \"c7a56fa5-2504-4cbc-87c9-769b6c88b362\") " pod="openshift-image-registry/image-registry-66df7c8f76-lslqp" Mar 16 00:20:12 crc kubenswrapper[4816]: I0316 00:20:12.070921 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c7a56fa5-2504-4cbc-87c9-769b6c88b362-registry-certificates\") pod \"image-registry-66df7c8f76-lslqp\" (UID: \"c7a56fa5-2504-4cbc-87c9-769b6c88b362\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-lslqp" Mar 16 00:20:12 crc kubenswrapper[4816]: I0316 00:20:12.071050 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c7a56fa5-2504-4cbc-87c9-769b6c88b362-trusted-ca\") pod \"image-registry-66df7c8f76-lslqp\" (UID: \"c7a56fa5-2504-4cbc-87c9-769b6c88b362\") " pod="openshift-image-registry/image-registry-66df7c8f76-lslqp" Mar 16 00:20:12 crc kubenswrapper[4816]: I0316 00:20:12.071209 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c7a56fa5-2504-4cbc-87c9-769b6c88b362-ca-trust-extracted\") pod \"image-registry-66df7c8f76-lslqp\" (UID: \"c7a56fa5-2504-4cbc-87c9-769b6c88b362\") " pod="openshift-image-registry/image-registry-66df7c8f76-lslqp" Mar 16 00:20:12 crc kubenswrapper[4816]: I0316 00:20:12.071327 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c7a56fa5-2504-4cbc-87c9-769b6c88b362-registry-tls\") pod \"image-registry-66df7c8f76-lslqp\" (UID: \"c7a56fa5-2504-4cbc-87c9-769b6c88b362\") " pod="openshift-image-registry/image-registry-66df7c8f76-lslqp" Mar 16 00:20:12 crc kubenswrapper[4816]: I0316 00:20:12.071435 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jx29s\" (UniqueName: \"kubernetes.io/projected/c7a56fa5-2504-4cbc-87c9-769b6c88b362-kube-api-access-jx29s\") pod \"image-registry-66df7c8f76-lslqp\" (UID: \"c7a56fa5-2504-4cbc-87c9-769b6c88b362\") " pod="openshift-image-registry/image-registry-66df7c8f76-lslqp" Mar 16 00:20:12 crc kubenswrapper[4816]: I0316 00:20:12.071793 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c7a56fa5-2504-4cbc-87c9-769b6c88b362-ca-trust-extracted\") pod 
\"image-registry-66df7c8f76-lslqp\" (UID: \"c7a56fa5-2504-4cbc-87c9-769b6c88b362\") " pod="openshift-image-registry/image-registry-66df7c8f76-lslqp" Mar 16 00:20:12 crc kubenswrapper[4816]: I0316 00:20:12.072601 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c7a56fa5-2504-4cbc-87c9-769b6c88b362-trusted-ca\") pod \"image-registry-66df7c8f76-lslqp\" (UID: \"c7a56fa5-2504-4cbc-87c9-769b6c88b362\") " pod="openshift-image-registry/image-registry-66df7c8f76-lslqp" Mar 16 00:20:12 crc kubenswrapper[4816]: I0316 00:20:12.072675 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c7a56fa5-2504-4cbc-87c9-769b6c88b362-registry-certificates\") pod \"image-registry-66df7c8f76-lslqp\" (UID: \"c7a56fa5-2504-4cbc-87c9-769b6c88b362\") " pod="openshift-image-registry/image-registry-66df7c8f76-lslqp" Mar 16 00:20:12 crc kubenswrapper[4816]: I0316 00:20:12.075893 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c7a56fa5-2504-4cbc-87c9-769b6c88b362-registry-tls\") pod \"image-registry-66df7c8f76-lslqp\" (UID: \"c7a56fa5-2504-4cbc-87c9-769b6c88b362\") " pod="openshift-image-registry/image-registry-66df7c8f76-lslqp" Mar 16 00:20:12 crc kubenswrapper[4816]: I0316 00:20:12.078878 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c7a56fa5-2504-4cbc-87c9-769b6c88b362-installation-pull-secrets\") pod \"image-registry-66df7c8f76-lslqp\" (UID: \"c7a56fa5-2504-4cbc-87c9-769b6c88b362\") " pod="openshift-image-registry/image-registry-66df7c8f76-lslqp" Mar 16 00:20:12 crc kubenswrapper[4816]: I0316 00:20:12.091732 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jx29s\" (UniqueName: 
\"kubernetes.io/projected/c7a56fa5-2504-4cbc-87c9-769b6c88b362-kube-api-access-jx29s\") pod \"image-registry-66df7c8f76-lslqp\" (UID: \"c7a56fa5-2504-4cbc-87c9-769b6c88b362\") " pod="openshift-image-registry/image-registry-66df7c8f76-lslqp" Mar 16 00:20:12 crc kubenswrapper[4816]: I0316 00:20:12.093148 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c7a56fa5-2504-4cbc-87c9-769b6c88b362-bound-sa-token\") pod \"image-registry-66df7c8f76-lslqp\" (UID: \"c7a56fa5-2504-4cbc-87c9-769b6c88b362\") " pod="openshift-image-registry/image-registry-66df7c8f76-lslqp" Mar 16 00:20:12 crc kubenswrapper[4816]: I0316 00:20:12.102226 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-lslqp" Mar 16 00:20:12 crc kubenswrapper[4816]: I0316 00:20:12.335338 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-lslqp"] Mar 16 00:20:12 crc kubenswrapper[4816]: I0316 00:20:12.392103 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-lslqp" event={"ID":"c7a56fa5-2504-4cbc-87c9-769b6c88b362","Type":"ContainerStarted","Data":"1b866e033d39f36c3f3be137aec614f6b7183066771a995c5186d2ace40ccf4a"} Mar 16 00:20:13 crc kubenswrapper[4816]: I0316 00:20:13.401148 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-lslqp" event={"ID":"c7a56fa5-2504-4cbc-87c9-769b6c88b362","Type":"ContainerStarted","Data":"e0fdd0c14a8a6704265bba7a35d3d797c6de6599bb0a9e1fee5998e2a4d29135"} Mar 16 00:20:13 crc kubenswrapper[4816]: I0316 00:20:13.401439 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-lslqp" Mar 16 00:20:13 crc kubenswrapper[4816]: I0316 00:20:13.431233 4816 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-lslqp" podStartSLOduration=2.431208724 podStartE2EDuration="2.431208724s" podCreationTimestamp="2026-03-16 00:20:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:20:13.426302361 +0000 UTC m=+806.522602314" watchObservedRunningTime="2026-03-16 00:20:13.431208724 +0000 UTC m=+806.527508697" Mar 16 00:20:32 crc kubenswrapper[4816]: I0316 00:20:32.111498 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-lslqp" Mar 16 00:20:32 crc kubenswrapper[4816]: I0316 00:20:32.165702 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-ckvwn"] Mar 16 00:20:48 crc kubenswrapper[4816]: I0316 00:20:48.426661 4816 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 16 00:20:57 crc kubenswrapper[4816]: I0316 00:20:57.212311 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" podUID="b155133b-d494-44bc-aa5d-23efc7cbd7a6" containerName="registry" containerID="cri-o://4318cca39cc5c9226ea995c847972dcda126bf000cf9cc35ad09a9cdabdc663b" gracePeriod=30 Mar 16 00:20:57 crc kubenswrapper[4816]: I0316 00:20:57.586589 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:20:57 crc kubenswrapper[4816]: I0316 00:20:57.678646 4816 generic.go:334] "Generic (PLEG): container finished" podID="b155133b-d494-44bc-aa5d-23efc7cbd7a6" containerID="4318cca39cc5c9226ea995c847972dcda126bf000cf9cc35ad09a9cdabdc663b" exitCode=0 Mar 16 00:20:57 crc kubenswrapper[4816]: I0316 00:20:57.678697 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" event={"ID":"b155133b-d494-44bc-aa5d-23efc7cbd7a6","Type":"ContainerDied","Data":"4318cca39cc5c9226ea995c847972dcda126bf000cf9cc35ad09a9cdabdc663b"} Mar 16 00:20:57 crc kubenswrapper[4816]: I0316 00:20:57.678724 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" event={"ID":"b155133b-d494-44bc-aa5d-23efc7cbd7a6","Type":"ContainerDied","Data":"e368502f9ca177437add127848813e2ad33e96c185b8ab726042b2878dcec995"} Mar 16 00:20:57 crc kubenswrapper[4816]: I0316 00:20:57.678745 4816 scope.go:117] "RemoveContainer" containerID="4318cca39cc5c9226ea995c847972dcda126bf000cf9cc35ad09a9cdabdc663b" Mar 16 00:20:57 crc kubenswrapper[4816]: I0316 00:20:57.678808 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-ckvwn" Mar 16 00:20:57 crc kubenswrapper[4816]: I0316 00:20:57.697032 4816 scope.go:117] "RemoveContainer" containerID="4318cca39cc5c9226ea995c847972dcda126bf000cf9cc35ad09a9cdabdc663b" Mar 16 00:20:57 crc kubenswrapper[4816]: E0316 00:20:57.697815 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4318cca39cc5c9226ea995c847972dcda126bf000cf9cc35ad09a9cdabdc663b\": container with ID starting with 4318cca39cc5c9226ea995c847972dcda126bf000cf9cc35ad09a9cdabdc663b not found: ID does not exist" containerID="4318cca39cc5c9226ea995c847972dcda126bf000cf9cc35ad09a9cdabdc663b" Mar 16 00:20:57 crc kubenswrapper[4816]: I0316 00:20:57.697892 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4318cca39cc5c9226ea995c847972dcda126bf000cf9cc35ad09a9cdabdc663b"} err="failed to get container status \"4318cca39cc5c9226ea995c847972dcda126bf000cf9cc35ad09a9cdabdc663b\": rpc error: code = NotFound desc = could not find container \"4318cca39cc5c9226ea995c847972dcda126bf000cf9cc35ad09a9cdabdc663b\": container with ID starting with 4318cca39cc5c9226ea995c847972dcda126bf000cf9cc35ad09a9cdabdc663b not found: ID does not exist" Mar 16 00:20:57 crc kubenswrapper[4816]: I0316 00:20:57.739125 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9r7jk\" (UniqueName: \"kubernetes.io/projected/b155133b-d494-44bc-aa5d-23efc7cbd7a6-kube-api-access-9r7jk\") pod \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " Mar 16 00:20:57 crc kubenswrapper[4816]: I0316 00:20:57.739199 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b155133b-d494-44bc-aa5d-23efc7cbd7a6-installation-pull-secrets\") pod 
\"b155133b-d494-44bc-aa5d-23efc7cbd7a6\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " Mar 16 00:20:57 crc kubenswrapper[4816]: I0316 00:20:57.739230 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b155133b-d494-44bc-aa5d-23efc7cbd7a6-ca-trust-extracted\") pod \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " Mar 16 00:20:57 crc kubenswrapper[4816]: I0316 00:20:57.739285 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b155133b-d494-44bc-aa5d-23efc7cbd7a6-registry-certificates\") pod \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " Mar 16 00:20:57 crc kubenswrapper[4816]: I0316 00:20:57.739321 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b155133b-d494-44bc-aa5d-23efc7cbd7a6-registry-tls\") pod \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " Mar 16 00:20:57 crc kubenswrapper[4816]: I0316 00:20:57.739360 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b155133b-d494-44bc-aa5d-23efc7cbd7a6-bound-sa-token\") pod \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " Mar 16 00:20:57 crc kubenswrapper[4816]: I0316 00:20:57.739404 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b155133b-d494-44bc-aa5d-23efc7cbd7a6-trusted-ca\") pod \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " Mar 16 00:20:57 crc kubenswrapper[4816]: I0316 00:20:57.739615 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\" (UID: \"b155133b-d494-44bc-aa5d-23efc7cbd7a6\") " Mar 16 00:20:57 crc kubenswrapper[4816]: I0316 00:20:57.742205 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b155133b-d494-44bc-aa5d-23efc7cbd7a6-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "b155133b-d494-44bc-aa5d-23efc7cbd7a6" (UID: "b155133b-d494-44bc-aa5d-23efc7cbd7a6"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:20:57 crc kubenswrapper[4816]: I0316 00:20:57.742758 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b155133b-d494-44bc-aa5d-23efc7cbd7a6-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "b155133b-d494-44bc-aa5d-23efc7cbd7a6" (UID: "b155133b-d494-44bc-aa5d-23efc7cbd7a6"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:20:57 crc kubenswrapper[4816]: I0316 00:20:57.747486 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b155133b-d494-44bc-aa5d-23efc7cbd7a6-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "b155133b-d494-44bc-aa5d-23efc7cbd7a6" (UID: "b155133b-d494-44bc-aa5d-23efc7cbd7a6"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:20:57 crc kubenswrapper[4816]: I0316 00:20:57.751272 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b155133b-d494-44bc-aa5d-23efc7cbd7a6-kube-api-access-9r7jk" (OuterVolumeSpecName: "kube-api-access-9r7jk") pod "b155133b-d494-44bc-aa5d-23efc7cbd7a6" (UID: "b155133b-d494-44bc-aa5d-23efc7cbd7a6"). InnerVolumeSpecName "kube-api-access-9r7jk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:20:57 crc kubenswrapper[4816]: I0316 00:20:57.751763 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b155133b-d494-44bc-aa5d-23efc7cbd7a6-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "b155133b-d494-44bc-aa5d-23efc7cbd7a6" (UID: "b155133b-d494-44bc-aa5d-23efc7cbd7a6"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:20:57 crc kubenswrapper[4816]: I0316 00:20:57.751997 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b155133b-d494-44bc-aa5d-23efc7cbd7a6-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "b155133b-d494-44bc-aa5d-23efc7cbd7a6" (UID: "b155133b-d494-44bc-aa5d-23efc7cbd7a6"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:20:57 crc kubenswrapper[4816]: I0316 00:20:57.757482 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b155133b-d494-44bc-aa5d-23efc7cbd7a6-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "b155133b-d494-44bc-aa5d-23efc7cbd7a6" (UID: "b155133b-d494-44bc-aa5d-23efc7cbd7a6"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:20:57 crc kubenswrapper[4816]: I0316 00:20:57.761123 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "b155133b-d494-44bc-aa5d-23efc7cbd7a6" (UID: "b155133b-d494-44bc-aa5d-23efc7cbd7a6"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 16 00:20:57 crc kubenswrapper[4816]: I0316 00:20:57.841140 4816 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b155133b-d494-44bc-aa5d-23efc7cbd7a6-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 16 00:20:57 crc kubenswrapper[4816]: I0316 00:20:57.841197 4816 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b155133b-d494-44bc-aa5d-23efc7cbd7a6-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 16 00:20:57 crc kubenswrapper[4816]: I0316 00:20:57.841210 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9r7jk\" (UniqueName: \"kubernetes.io/projected/b155133b-d494-44bc-aa5d-23efc7cbd7a6-kube-api-access-9r7jk\") on node \"crc\" DevicePath \"\"" Mar 16 00:20:57 crc kubenswrapper[4816]: I0316 00:20:57.841229 4816 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b155133b-d494-44bc-aa5d-23efc7cbd7a6-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 16 00:20:57 crc kubenswrapper[4816]: I0316 00:20:57.841250 4816 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b155133b-d494-44bc-aa5d-23efc7cbd7a6-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 16 00:20:57 crc kubenswrapper[4816]: I0316 00:20:57.841336 4816 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b155133b-d494-44bc-aa5d-23efc7cbd7a6-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 16 00:20:57 crc kubenswrapper[4816]: I0316 00:20:57.841352 4816 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b155133b-d494-44bc-aa5d-23efc7cbd7a6-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 16 00:20:58 crc 
kubenswrapper[4816]: I0316 00:20:58.030206 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-ckvwn"] Mar 16 00:20:58 crc kubenswrapper[4816]: I0316 00:20:58.042764 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-ckvwn"] Mar 16 00:20:59 crc kubenswrapper[4816]: I0316 00:20:59.677347 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b155133b-d494-44bc-aa5d-23efc7cbd7a6" path="/var/lib/kubelet/pods/b155133b-d494-44bc-aa5d-23efc7cbd7a6/volumes" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.399487 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-psjs7"] Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.400443 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" podUID="2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" containerName="ovn-controller" containerID="cri-o://0fac0a986edf984b0e34eda0f969205d5bc92f23d0edacb3c6dd8721eaed2fb9" gracePeriod=30 Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.400540 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" podUID="2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" containerName="nbdb" containerID="cri-o://826495d8ab8f414169bf36159278da0a040fec699a74b4aebdc8f87ccdc48bd6" gracePeriod=30 Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.400613 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" podUID="2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" containerName="ovn-acl-logging" containerID="cri-o://86c28f1ae2f2fb61b7755ac14bc7f5346036735c75daccaa0e31b4e606cfb4c2" gracePeriod=30 Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.400598 4816 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" podUID="2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://4e0a510814d60106574e3cd3e612c031ac900caa0f24a12a62d7cd1f6bfda1ed" gracePeriod=30 Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.400782 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" podUID="2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" containerName="northd" containerID="cri-o://aa5f052d32ad8f52f33ee5baea5af98a64f05b831b86ebced6c9422fb4845fcf" gracePeriod=30 Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.400816 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" podUID="2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" containerName="sbdb" containerID="cri-o://4d6bf581280690c99ddbbecd4cf4e215a12230d377978aa48acd3c05b85a3c56" gracePeriod=30 Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.400589 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" podUID="2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" containerName="kube-rbac-proxy-node" containerID="cri-o://f47b6c54d5d72d827208f4d8228be260d9d19b4a720b3174349288dee2916641" gracePeriod=30 Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.449574 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" podUID="2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" containerName="ovnkube-controller" containerID="cri-o://7e5d87dc1889484bb2175c0613eda0b852c65a289a1c165f6adae2a822892aa2" gracePeriod=30 Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.707775 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-szscw_e9789e58-12c8-4831-9401-af48a3e92209/kube-multus/2.log" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.708303 4816 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-multus_multus-szscw_e9789e58-12c8-4831-9401-af48a3e92209/kube-multus/1.log" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.708361 4816 generic.go:334] "Generic (PLEG): container finished" podID="e9789e58-12c8-4831-9401-af48a3e92209" containerID="707ec2df051aa6206ac2bc1c4db6b5fe6b37467b90b6ee42dbf28f2b88e5d6e6" exitCode=2 Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.708465 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-szscw" event={"ID":"e9789e58-12c8-4831-9401-af48a3e92209","Type":"ContainerDied","Data":"707ec2df051aa6206ac2bc1c4db6b5fe6b37467b90b6ee42dbf28f2b88e5d6e6"} Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.708508 4816 scope.go:117] "RemoveContainer" containerID="b45d6058e72f7117fdfb86b3480de77c8592dba7dcd7932b2a5c34036da7af26" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.709152 4816 scope.go:117] "RemoveContainer" containerID="707ec2df051aa6206ac2bc1c4db6b5fe6b37467b90b6ee42dbf28f2b88e5d6e6" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.713214 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-psjs7_2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88/ovnkube-controller/3.log" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.716289 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-psjs7_2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88/ovn-acl-logging/0.log" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.717141 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-psjs7_2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88/ovn-controller/0.log" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.718415 4816 generic.go:334] "Generic (PLEG): container finished" podID="2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" containerID="7e5d87dc1889484bb2175c0613eda0b852c65a289a1c165f6adae2a822892aa2" 
exitCode=0 Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.718455 4816 generic.go:334] "Generic (PLEG): container finished" podID="2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" containerID="4d6bf581280690c99ddbbecd4cf4e215a12230d377978aa48acd3c05b85a3c56" exitCode=0 Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.718463 4816 generic.go:334] "Generic (PLEG): container finished" podID="2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" containerID="826495d8ab8f414169bf36159278da0a040fec699a74b4aebdc8f87ccdc48bd6" exitCode=0 Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.718470 4816 generic.go:334] "Generic (PLEG): container finished" podID="2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" containerID="aa5f052d32ad8f52f33ee5baea5af98a64f05b831b86ebced6c9422fb4845fcf" exitCode=0 Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.718477 4816 generic.go:334] "Generic (PLEG): container finished" podID="2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" containerID="4e0a510814d60106574e3cd3e612c031ac900caa0f24a12a62d7cd1f6bfda1ed" exitCode=0 Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.718483 4816 generic.go:334] "Generic (PLEG): container finished" podID="2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" containerID="f47b6c54d5d72d827208f4d8228be260d9d19b4a720b3174349288dee2916641" exitCode=0 Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.718489 4816 generic.go:334] "Generic (PLEG): container finished" podID="2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" containerID="86c28f1ae2f2fb61b7755ac14bc7f5346036735c75daccaa0e31b4e606cfb4c2" exitCode=143 Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.718496 4816 generic.go:334] "Generic (PLEG): container finished" podID="2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" containerID="0fac0a986edf984b0e34eda0f969205d5bc92f23d0edacb3c6dd8721eaed2fb9" exitCode=143 Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.718506 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" 
event={"ID":"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88","Type":"ContainerDied","Data":"7e5d87dc1889484bb2175c0613eda0b852c65a289a1c165f6adae2a822892aa2"} Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.718581 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" event={"ID":"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88","Type":"ContainerDied","Data":"4d6bf581280690c99ddbbecd4cf4e215a12230d377978aa48acd3c05b85a3c56"} Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.718600 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" event={"ID":"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88","Type":"ContainerDied","Data":"826495d8ab8f414169bf36159278da0a040fec699a74b4aebdc8f87ccdc48bd6"} Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.718616 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" event={"ID":"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88","Type":"ContainerDied","Data":"aa5f052d32ad8f52f33ee5baea5af98a64f05b831b86ebced6c9422fb4845fcf"} Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.718631 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" event={"ID":"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88","Type":"ContainerDied","Data":"4e0a510814d60106574e3cd3e612c031ac900caa0f24a12a62d7cd1f6bfda1ed"} Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.718645 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" event={"ID":"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88","Type":"ContainerDied","Data":"f47b6c54d5d72d827208f4d8228be260d9d19b4a720b3174349288dee2916641"} Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.718660 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" 
event={"ID":"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88","Type":"ContainerDied","Data":"86c28f1ae2f2fb61b7755ac14bc7f5346036735c75daccaa0e31b4e606cfb4c2"} Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.718675 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" event={"ID":"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88","Type":"ContainerDied","Data":"0fac0a986edf984b0e34eda0f969205d5bc92f23d0edacb3c6dd8721eaed2fb9"} Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.718688 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" event={"ID":"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88","Type":"ContainerDied","Data":"5a5bb9275c0f8cb8f5457ed5c2f6ecf42790ebe9298a9783a1a55a1c78e14761"} Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.718702 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5a5bb9275c0f8cb8f5457ed5c2f6ecf42790ebe9298a9783a1a55a1c78e14761" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.740232 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-psjs7_2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88/ovnkube-controller/3.log" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.744411 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-psjs7_2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88/ovn-acl-logging/0.log" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.745070 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-psjs7_2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88/ovn-controller/0.log" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.745701 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.746434 4816 scope.go:117] "RemoveContainer" containerID="ba7195b779774df29cc722244d9b4290e7f75033f6e135e41288fbb5310a7c3c" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.806406 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-8h7qh"] Mar 16 00:21:00 crc kubenswrapper[4816]: E0316 00:21:00.806685 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" containerName="ovnkube-controller" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.806710 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" containerName="ovnkube-controller" Mar 16 00:21:00 crc kubenswrapper[4816]: E0316 00:21:00.806723 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" containerName="ovn-acl-logging" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.806733 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" containerName="ovn-acl-logging" Mar 16 00:21:00 crc kubenswrapper[4816]: E0316 00:21:00.806749 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" containerName="kube-rbac-proxy-node" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.806758 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" containerName="kube-rbac-proxy-node" Mar 16 00:21:00 crc kubenswrapper[4816]: E0316 00:21:00.806769 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" containerName="sbdb" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.806777 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" 
containerName="sbdb" Mar 16 00:21:00 crc kubenswrapper[4816]: E0316 00:21:00.806788 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" containerName="northd" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.806795 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" containerName="northd" Mar 16 00:21:00 crc kubenswrapper[4816]: E0316 00:21:00.806808 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" containerName="ovnkube-controller" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.806817 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" containerName="ovnkube-controller" Mar 16 00:21:00 crc kubenswrapper[4816]: E0316 00:21:00.806828 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" containerName="ovn-controller" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.806837 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" containerName="ovn-controller" Mar 16 00:21:00 crc kubenswrapper[4816]: E0316 00:21:00.806852 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b155133b-d494-44bc-aa5d-23efc7cbd7a6" containerName="registry" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.806861 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="b155133b-d494-44bc-aa5d-23efc7cbd7a6" containerName="registry" Mar 16 00:21:00 crc kubenswrapper[4816]: E0316 00:21:00.806873 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" containerName="nbdb" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.806881 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" containerName="nbdb" Mar 16 00:21:00 crc kubenswrapper[4816]: 
E0316 00:21:00.806893 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" containerName="kubecfg-setup" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.806903 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" containerName="kubecfg-setup" Mar 16 00:21:00 crc kubenswrapper[4816]: E0316 00:21:00.806914 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" containerName="ovnkube-controller" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.806922 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" containerName="ovnkube-controller" Mar 16 00:21:00 crc kubenswrapper[4816]: E0316 00:21:00.806932 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" containerName="kube-rbac-proxy-ovn-metrics" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.806940 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" containerName="kube-rbac-proxy-ovn-metrics" Mar 16 00:21:00 crc kubenswrapper[4816]: E0316 00:21:00.806955 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" containerName="ovnkube-controller" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.806964 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" containerName="ovnkube-controller" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.807085 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" containerName="northd" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.807100 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" containerName="kube-rbac-proxy-node" Mar 16 00:21:00 
crc kubenswrapper[4816]: I0316 00:21:00.807110 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" containerName="ovnkube-controller" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.807119 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" containerName="kube-rbac-proxy-ovn-metrics" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.807129 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" containerName="ovnkube-controller" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.807138 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" containerName="sbdb" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.807150 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" containerName="ovnkube-controller" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.807162 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" containerName="nbdb" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.807169 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" containerName="ovn-controller" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.807182 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" containerName="ovn-acl-logging" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.807194 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="b155133b-d494-44bc-aa5d-23efc7cbd7a6" containerName="registry" Mar 16 00:21:00 crc kubenswrapper[4816]: E0316 00:21:00.807323 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" 
containerName="ovnkube-controller" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.807334 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" containerName="ovnkube-controller" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.807444 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" containerName="ovnkube-controller" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.807457 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" containerName="ovnkube-controller" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.809537 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-8h7qh" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.884031 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-host-var-lib-cni-networks-ovn-kubernetes\") pod \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\" (UID: \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.884098 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-ovnkube-config\") pod \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\" (UID: \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.884133 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-host-cni-bin\") pod \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\" (UID: \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 
00:21:00.884163 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9rd68\" (UniqueName: \"kubernetes.io/projected/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-kube-api-access-9rd68\") pod \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\" (UID: \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.884187 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-host-kubelet\") pod \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\" (UID: \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.884207 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-ovnkube-script-lib\") pod \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\" (UID: \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.884250 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-systemd-units\") pod \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\" (UID: \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.884270 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-host-run-ovn-kubernetes\") pod \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\" (UID: \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.884290 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: 
\"kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-log-socket\") pod \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\" (UID: \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.884312 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-run-ovn\") pod \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\" (UID: \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.884336 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-host-cni-netd\") pod \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\" (UID: \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.884366 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-ovn-node-metrics-cert\") pod \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\" (UID: \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.884393 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-run-openvswitch\") pod \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\" (UID: \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.884418 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-env-overrides\") pod \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\" (UID: \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.884440 
4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-node-log\") pod \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\" (UID: \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.884484 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-host-slash\") pod \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\" (UID: \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.884507 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-run-systemd\") pod \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\" (UID: \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.884573 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-var-lib-openvswitch\") pod \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\" (UID: \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.884596 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-host-run-netns\") pod \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\" (UID: \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.884617 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-etc-openvswitch\") pod \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\" 
(UID: \"2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88\") " Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.885248 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" (UID: "2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.885753 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" (UID: "2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.885793 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" (UID: "2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.887971 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" (UID: "2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88"). InnerVolumeSpecName "host-kubelet". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.888667 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" (UID: "2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.888682 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" (UID: "2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.888723 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" (UID: "2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.888730 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-host-slash" (OuterVolumeSpecName: "host-slash") pod "2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" (UID: "2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88"). InnerVolumeSpecName "host-slash". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.888853 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" (UID: "2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.888874 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" (UID: "2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.888897 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-log-socket" (OuterVolumeSpecName: "log-socket") pod "2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" (UID: "2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.888878 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-node-log" (OuterVolumeSpecName: "node-log") pod "2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" (UID: "2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88"). InnerVolumeSpecName "node-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.888918 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" (UID: "2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.888949 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" (UID: "2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.889028 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" (UID: "2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.889182 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" (UID: "2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.889279 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" (UID: "2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.891928 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-kube-api-access-9rd68" (OuterVolumeSpecName: "kube-api-access-9rd68") pod "2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" (UID: "2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88"). InnerVolumeSpecName "kube-api-access-9rd68". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.894215 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" (UID: "2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.911425 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" (UID: "2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.985903 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/842bb112-f402-4717-bc56-f488fc3c5db7-systemd-units\") pod \"ovnkube-node-8h7qh\" (UID: \"842bb112-f402-4717-bc56-f488fc3c5db7\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h7qh" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.986049 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/842bb112-f402-4717-bc56-f488fc3c5db7-env-overrides\") pod \"ovnkube-node-8h7qh\" (UID: \"842bb112-f402-4717-bc56-f488fc3c5db7\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h7qh" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.986134 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/842bb112-f402-4717-bc56-f488fc3c5db7-var-lib-openvswitch\") pod \"ovnkube-node-8h7qh\" (UID: \"842bb112-f402-4717-bc56-f488fc3c5db7\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h7qh" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.986185 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/842bb112-f402-4717-bc56-f488fc3c5db7-log-socket\") pod \"ovnkube-node-8h7qh\" (UID: \"842bb112-f402-4717-bc56-f488fc3c5db7\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h7qh" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.986219 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/842bb112-f402-4717-bc56-f488fc3c5db7-ovnkube-script-lib\") pod \"ovnkube-node-8h7qh\" (UID: 
\"842bb112-f402-4717-bc56-f488fc3c5db7\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h7qh" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.986304 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/842bb112-f402-4717-bc56-f488fc3c5db7-ovn-node-metrics-cert\") pod \"ovnkube-node-8h7qh\" (UID: \"842bb112-f402-4717-bc56-f488fc3c5db7\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h7qh" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.986342 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/842bb112-f402-4717-bc56-f488fc3c5db7-run-systemd\") pod \"ovnkube-node-8h7qh\" (UID: \"842bb112-f402-4717-bc56-f488fc3c5db7\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h7qh" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.986374 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qpl2\" (UniqueName: \"kubernetes.io/projected/842bb112-f402-4717-bc56-f488fc3c5db7-kube-api-access-2qpl2\") pod \"ovnkube-node-8h7qh\" (UID: \"842bb112-f402-4717-bc56-f488fc3c5db7\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h7qh" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.986443 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/842bb112-f402-4717-bc56-f488fc3c5db7-host-slash\") pod \"ovnkube-node-8h7qh\" (UID: \"842bb112-f402-4717-bc56-f488fc3c5db7\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h7qh" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.986509 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/842bb112-f402-4717-bc56-f488fc3c5db7-host-cni-bin\") pod 
\"ovnkube-node-8h7qh\" (UID: \"842bb112-f402-4717-bc56-f488fc3c5db7\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h7qh" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.986590 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/842bb112-f402-4717-bc56-f488fc3c5db7-run-ovn\") pod \"ovnkube-node-8h7qh\" (UID: \"842bb112-f402-4717-bc56-f488fc3c5db7\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h7qh" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.986622 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/842bb112-f402-4717-bc56-f488fc3c5db7-host-run-ovn-kubernetes\") pod \"ovnkube-node-8h7qh\" (UID: \"842bb112-f402-4717-bc56-f488fc3c5db7\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h7qh" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.986714 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/842bb112-f402-4717-bc56-f488fc3c5db7-host-run-netns\") pod \"ovnkube-node-8h7qh\" (UID: \"842bb112-f402-4717-bc56-f488fc3c5db7\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h7qh" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.986743 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/842bb112-f402-4717-bc56-f488fc3c5db7-etc-openvswitch\") pod \"ovnkube-node-8h7qh\" (UID: \"842bb112-f402-4717-bc56-f488fc3c5db7\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h7qh" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.986771 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/842bb112-f402-4717-bc56-f488fc3c5db7-host-kubelet\") pod \"ovnkube-node-8h7qh\" (UID: \"842bb112-f402-4717-bc56-f488fc3c5db7\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h7qh" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.986846 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/842bb112-f402-4717-bc56-f488fc3c5db7-ovnkube-config\") pod \"ovnkube-node-8h7qh\" (UID: \"842bb112-f402-4717-bc56-f488fc3c5db7\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h7qh" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.986888 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/842bb112-f402-4717-bc56-f488fc3c5db7-run-openvswitch\") pod \"ovnkube-node-8h7qh\" (UID: \"842bb112-f402-4717-bc56-f488fc3c5db7\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h7qh" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.986923 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/842bb112-f402-4717-bc56-f488fc3c5db7-host-cni-netd\") pod \"ovnkube-node-8h7qh\" (UID: \"842bb112-f402-4717-bc56-f488fc3c5db7\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h7qh" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.986954 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/842bb112-f402-4717-bc56-f488fc3c5db7-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-8h7qh\" (UID: \"842bb112-f402-4717-bc56-f488fc3c5db7\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h7qh" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.986998 4816 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/842bb112-f402-4717-bc56-f488fc3c5db7-node-log\") pod \"ovnkube-node-8h7qh\" (UID: \"842bb112-f402-4717-bc56-f488fc3c5db7\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h7qh" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.987091 4816 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.987113 4816 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-host-run-netns\") on node \"crc\" DevicePath \"\"" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.987134 4816 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.987153 4816 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.987170 4816 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.987186 4816 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-host-cni-bin\") on node \"crc\" DevicePath \"\"" Mar 16 00:21:00 crc 
kubenswrapper[4816]: I0316 00:21:00.987201 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9rd68\" (UniqueName: \"kubernetes.io/projected/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-kube-api-access-9rd68\") on node \"crc\" DevicePath \"\"" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.987216 4816 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.987258 4816 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-host-kubelet\") on node \"crc\" DevicePath \"\"" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.987274 4816 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-systemd-units\") on node \"crc\" DevicePath \"\"" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.987289 4816 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.987323 4816 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-log-socket\") on node \"crc\" DevicePath \"\"" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.987338 4816 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.987352 4816 reconciler_common.go:293] "Volume detached for volume 
\"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-host-cni-netd\") on node \"crc\" DevicePath \"\"" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.987368 4816 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.987384 4816 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.987398 4816 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-run-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.987413 4816 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-node-log\") on node \"crc\" DevicePath \"\"" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.987426 4816 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-host-slash\") on node \"crc\" DevicePath \"\"" Mar 16 00:21:00 crc kubenswrapper[4816]: I0316 00:21:00.987441 4816 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88-run-systemd\") on node \"crc\" DevicePath \"\"" Mar 16 00:21:01 crc kubenswrapper[4816]: I0316 00:21:01.089067 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/842bb112-f402-4717-bc56-f488fc3c5db7-var-lib-openvswitch\") pod \"ovnkube-node-8h7qh\" (UID: \"842bb112-f402-4717-bc56-f488fc3c5db7\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h7qh" Mar 16 00:21:01 crc kubenswrapper[4816]: I0316 00:21:01.089172 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/842bb112-f402-4717-bc56-f488fc3c5db7-log-socket\") pod \"ovnkube-node-8h7qh\" (UID: \"842bb112-f402-4717-bc56-f488fc3c5db7\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h7qh" Mar 16 00:21:01 crc kubenswrapper[4816]: I0316 00:21:01.089209 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/842bb112-f402-4717-bc56-f488fc3c5db7-ovnkube-script-lib\") pod \"ovnkube-node-8h7qh\" (UID: \"842bb112-f402-4717-bc56-f488fc3c5db7\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h7qh" Mar 16 00:21:01 crc kubenswrapper[4816]: I0316 00:21:01.089263 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/842bb112-f402-4717-bc56-f488fc3c5db7-var-lib-openvswitch\") pod \"ovnkube-node-8h7qh\" (UID: \"842bb112-f402-4717-bc56-f488fc3c5db7\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h7qh" Mar 16 00:21:01 crc kubenswrapper[4816]: I0316 00:21:01.089312 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/842bb112-f402-4717-bc56-f488fc3c5db7-log-socket\") pod \"ovnkube-node-8h7qh\" (UID: \"842bb112-f402-4717-bc56-f488fc3c5db7\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h7qh" Mar 16 00:21:01 crc kubenswrapper[4816]: I0316 00:21:01.089343 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/842bb112-f402-4717-bc56-f488fc3c5db7-ovn-node-metrics-cert\") pod \"ovnkube-node-8h7qh\" (UID: \"842bb112-f402-4717-bc56-f488fc3c5db7\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h7qh" Mar 16 00:21:01 crc kubenswrapper[4816]: I0316 00:21:01.089409 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/842bb112-f402-4717-bc56-f488fc3c5db7-run-systemd\") pod \"ovnkube-node-8h7qh\" (UID: \"842bb112-f402-4717-bc56-f488fc3c5db7\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h7qh" Mar 16 00:21:01 crc kubenswrapper[4816]: I0316 00:21:01.089462 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2qpl2\" (UniqueName: \"kubernetes.io/projected/842bb112-f402-4717-bc56-f488fc3c5db7-kube-api-access-2qpl2\") pod \"ovnkube-node-8h7qh\" (UID: \"842bb112-f402-4717-bc56-f488fc3c5db7\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h7qh" Mar 16 00:21:01 crc kubenswrapper[4816]: I0316 00:21:01.089503 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/842bb112-f402-4717-bc56-f488fc3c5db7-host-slash\") pod \"ovnkube-node-8h7qh\" (UID: \"842bb112-f402-4717-bc56-f488fc3c5db7\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h7qh" Mar 16 00:21:01 crc kubenswrapper[4816]: I0316 00:21:01.089526 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/842bb112-f402-4717-bc56-f488fc3c5db7-host-cni-bin\") pod \"ovnkube-node-8h7qh\" (UID: \"842bb112-f402-4717-bc56-f488fc3c5db7\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h7qh" Mar 16 00:21:01 crc kubenswrapper[4816]: I0316 00:21:01.089567 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/842bb112-f402-4717-bc56-f488fc3c5db7-run-ovn\") pod 
\"ovnkube-node-8h7qh\" (UID: \"842bb112-f402-4717-bc56-f488fc3c5db7\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h7qh" Mar 16 00:21:01 crc kubenswrapper[4816]: I0316 00:21:01.089623 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/842bb112-f402-4717-bc56-f488fc3c5db7-host-cni-bin\") pod \"ovnkube-node-8h7qh\" (UID: \"842bb112-f402-4717-bc56-f488fc3c5db7\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h7qh" Mar 16 00:21:01 crc kubenswrapper[4816]: I0316 00:21:01.089619 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/842bb112-f402-4717-bc56-f488fc3c5db7-run-systemd\") pod \"ovnkube-node-8h7qh\" (UID: \"842bb112-f402-4717-bc56-f488fc3c5db7\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h7qh" Mar 16 00:21:01 crc kubenswrapper[4816]: I0316 00:21:01.089673 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/842bb112-f402-4717-bc56-f488fc3c5db7-host-run-ovn-kubernetes\") pod \"ovnkube-node-8h7qh\" (UID: \"842bb112-f402-4717-bc56-f488fc3c5db7\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h7qh" Mar 16 00:21:01 crc kubenswrapper[4816]: I0316 00:21:01.089654 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/842bb112-f402-4717-bc56-f488fc3c5db7-run-ovn\") pod \"ovnkube-node-8h7qh\" (UID: \"842bb112-f402-4717-bc56-f488fc3c5db7\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h7qh" Mar 16 00:21:01 crc kubenswrapper[4816]: I0316 00:21:01.089632 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/842bb112-f402-4717-bc56-f488fc3c5db7-host-slash\") pod \"ovnkube-node-8h7qh\" (UID: \"842bb112-f402-4717-bc56-f488fc3c5db7\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h7qh" 
Mar 16 00:21:01 crc kubenswrapper[4816]: I0316 00:21:01.089706 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/842bb112-f402-4717-bc56-f488fc3c5db7-host-run-ovn-kubernetes\") pod \"ovnkube-node-8h7qh\" (UID: \"842bb112-f402-4717-bc56-f488fc3c5db7\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h7qh" Mar 16 00:21:01 crc kubenswrapper[4816]: I0316 00:21:01.089825 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/842bb112-f402-4717-bc56-f488fc3c5db7-host-run-netns\") pod \"ovnkube-node-8h7qh\" (UID: \"842bb112-f402-4717-bc56-f488fc3c5db7\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h7qh" Mar 16 00:21:01 crc kubenswrapper[4816]: I0316 00:21:01.089870 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/842bb112-f402-4717-bc56-f488fc3c5db7-etc-openvswitch\") pod \"ovnkube-node-8h7qh\" (UID: \"842bb112-f402-4717-bc56-f488fc3c5db7\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h7qh" Mar 16 00:21:01 crc kubenswrapper[4816]: I0316 00:21:01.089893 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/842bb112-f402-4717-bc56-f488fc3c5db7-host-kubelet\") pod \"ovnkube-node-8h7qh\" (UID: \"842bb112-f402-4717-bc56-f488fc3c5db7\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h7qh" Mar 16 00:21:01 crc kubenswrapper[4816]: I0316 00:21:01.089923 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/842bb112-f402-4717-bc56-f488fc3c5db7-ovnkube-config\") pod \"ovnkube-node-8h7qh\" (UID: \"842bb112-f402-4717-bc56-f488fc3c5db7\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h7qh" Mar 16 00:21:01 crc kubenswrapper[4816]: I0316 00:21:01.089950 4816 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/842bb112-f402-4717-bc56-f488fc3c5db7-host-run-netns\") pod \"ovnkube-node-8h7qh\" (UID: \"842bb112-f402-4717-bc56-f488fc3c5db7\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h7qh" Mar 16 00:21:01 crc kubenswrapper[4816]: I0316 00:21:01.089960 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/842bb112-f402-4717-bc56-f488fc3c5db7-run-openvswitch\") pod \"ovnkube-node-8h7qh\" (UID: \"842bb112-f402-4717-bc56-f488fc3c5db7\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h7qh" Mar 16 00:21:01 crc kubenswrapper[4816]: I0316 00:21:01.089990 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/842bb112-f402-4717-bc56-f488fc3c5db7-run-openvswitch\") pod \"ovnkube-node-8h7qh\" (UID: \"842bb112-f402-4717-bc56-f488fc3c5db7\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h7qh" Mar 16 00:21:01 crc kubenswrapper[4816]: I0316 00:21:01.090042 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/842bb112-f402-4717-bc56-f488fc3c5db7-etc-openvswitch\") pod \"ovnkube-node-8h7qh\" (UID: \"842bb112-f402-4717-bc56-f488fc3c5db7\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h7qh" Mar 16 00:21:01 crc kubenswrapper[4816]: I0316 00:21:01.090065 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/842bb112-f402-4717-bc56-f488fc3c5db7-host-cni-netd\") pod \"ovnkube-node-8h7qh\" (UID: \"842bb112-f402-4717-bc56-f488fc3c5db7\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h7qh" Mar 16 00:21:01 crc kubenswrapper[4816]: I0316 00:21:01.090096 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/842bb112-f402-4717-bc56-f488fc3c5db7-host-kubelet\") pod \"ovnkube-node-8h7qh\" (UID: \"842bb112-f402-4717-bc56-f488fc3c5db7\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h7qh" Mar 16 00:21:01 crc kubenswrapper[4816]: I0316 00:21:01.090102 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/842bb112-f402-4717-bc56-f488fc3c5db7-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-8h7qh\" (UID: \"842bb112-f402-4717-bc56-f488fc3c5db7\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h7qh" Mar 16 00:21:01 crc kubenswrapper[4816]: I0316 00:21:01.090172 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/842bb112-f402-4717-bc56-f488fc3c5db7-node-log\") pod \"ovnkube-node-8h7qh\" (UID: \"842bb112-f402-4717-bc56-f488fc3c5db7\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h7qh" Mar 16 00:21:01 crc kubenswrapper[4816]: I0316 00:21:01.090211 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/842bb112-f402-4717-bc56-f488fc3c5db7-systemd-units\") pod \"ovnkube-node-8h7qh\" (UID: \"842bb112-f402-4717-bc56-f488fc3c5db7\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h7qh" Mar 16 00:21:01 crc kubenswrapper[4816]: I0316 00:21:01.090255 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/842bb112-f402-4717-bc56-f488fc3c5db7-env-overrides\") pod \"ovnkube-node-8h7qh\" (UID: \"842bb112-f402-4717-bc56-f488fc3c5db7\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h7qh" Mar 16 00:21:01 crc kubenswrapper[4816]: I0316 00:21:01.090531 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/842bb112-f402-4717-bc56-f488fc3c5db7-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-8h7qh\" (UID: \"842bb112-f402-4717-bc56-f488fc3c5db7\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h7qh" Mar 16 00:21:01 crc kubenswrapper[4816]: I0316 00:21:01.090582 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/842bb112-f402-4717-bc56-f488fc3c5db7-node-log\") pod \"ovnkube-node-8h7qh\" (UID: \"842bb112-f402-4717-bc56-f488fc3c5db7\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h7qh" Mar 16 00:21:01 crc kubenswrapper[4816]: I0316 00:21:01.090596 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/842bb112-f402-4717-bc56-f488fc3c5db7-systemd-units\") pod \"ovnkube-node-8h7qh\" (UID: \"842bb112-f402-4717-bc56-f488fc3c5db7\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h7qh" Mar 16 00:21:01 crc kubenswrapper[4816]: I0316 00:21:01.090627 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/842bb112-f402-4717-bc56-f488fc3c5db7-host-cni-netd\") pod \"ovnkube-node-8h7qh\" (UID: \"842bb112-f402-4717-bc56-f488fc3c5db7\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h7qh" Mar 16 00:21:01 crc kubenswrapper[4816]: I0316 00:21:01.090953 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/842bb112-f402-4717-bc56-f488fc3c5db7-ovnkube-script-lib\") pod \"ovnkube-node-8h7qh\" (UID: \"842bb112-f402-4717-bc56-f488fc3c5db7\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h7qh" Mar 16 00:21:01 crc kubenswrapper[4816]: I0316 00:21:01.091142 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/842bb112-f402-4717-bc56-f488fc3c5db7-ovnkube-config\") pod 
\"ovnkube-node-8h7qh\" (UID: \"842bb112-f402-4717-bc56-f488fc3c5db7\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h7qh" Mar 16 00:21:01 crc kubenswrapper[4816]: I0316 00:21:01.091231 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/842bb112-f402-4717-bc56-f488fc3c5db7-env-overrides\") pod \"ovnkube-node-8h7qh\" (UID: \"842bb112-f402-4717-bc56-f488fc3c5db7\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h7qh" Mar 16 00:21:01 crc kubenswrapper[4816]: I0316 00:21:01.094909 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/842bb112-f402-4717-bc56-f488fc3c5db7-ovn-node-metrics-cert\") pod \"ovnkube-node-8h7qh\" (UID: \"842bb112-f402-4717-bc56-f488fc3c5db7\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h7qh" Mar 16 00:21:01 crc kubenswrapper[4816]: I0316 00:21:01.110355 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qpl2\" (UniqueName: \"kubernetes.io/projected/842bb112-f402-4717-bc56-f488fc3c5db7-kube-api-access-2qpl2\") pod \"ovnkube-node-8h7qh\" (UID: \"842bb112-f402-4717-bc56-f488fc3c5db7\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h7qh" Mar 16 00:21:01 crc kubenswrapper[4816]: I0316 00:21:01.126015 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-8h7qh" Mar 16 00:21:01 crc kubenswrapper[4816]: W0316 00:21:01.150141 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod842bb112_f402_4717_bc56_f488fc3c5db7.slice/crio-f8b857aaf91a1aa57b19fc29051ada8ec2a1a9b761c683ed3e9fa9e5d97497f5 WatchSource:0}: Error finding container f8b857aaf91a1aa57b19fc29051ada8ec2a1a9b761c683ed3e9fa9e5d97497f5: Status 404 returned error can't find the container with id f8b857aaf91a1aa57b19fc29051ada8ec2a1a9b761c683ed3e9fa9e5d97497f5 Mar 16 00:21:01 crc kubenswrapper[4816]: I0316 00:21:01.728388 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-psjs7_2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88/ovn-acl-logging/0.log" Mar 16 00:21:01 crc kubenswrapper[4816]: I0316 00:21:01.729447 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-psjs7_2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88/ovn-controller/0.log" Mar 16 00:21:01 crc kubenswrapper[4816]: I0316 00:21:01.730201 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-psjs7" Mar 16 00:21:01 crc kubenswrapper[4816]: I0316 00:21:01.731960 4816 generic.go:334] "Generic (PLEG): container finished" podID="842bb112-f402-4717-bc56-f488fc3c5db7" containerID="692279448e8a204929ac728470e76248b7a686ab449a7b870b91a40eb34e40de" exitCode=0 Mar 16 00:21:01 crc kubenswrapper[4816]: I0316 00:21:01.732020 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8h7qh" event={"ID":"842bb112-f402-4717-bc56-f488fc3c5db7","Type":"ContainerDied","Data":"692279448e8a204929ac728470e76248b7a686ab449a7b870b91a40eb34e40de"} Mar 16 00:21:01 crc kubenswrapper[4816]: I0316 00:21:01.732041 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8h7qh" event={"ID":"842bb112-f402-4717-bc56-f488fc3c5db7","Type":"ContainerStarted","Data":"f8b857aaf91a1aa57b19fc29051ada8ec2a1a9b761c683ed3e9fa9e5d97497f5"} Mar 16 00:21:01 crc kubenswrapper[4816]: I0316 00:21:01.735086 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-szscw_e9789e58-12c8-4831-9401-af48a3e92209/kube-multus/2.log" Mar 16 00:21:01 crc kubenswrapper[4816]: I0316 00:21:01.735130 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-szscw" event={"ID":"e9789e58-12c8-4831-9401-af48a3e92209","Type":"ContainerStarted","Data":"4bde4ed98c5f5d1c0d8946acfc8cc13121f014a1d939f0f14b6cd0165659d331"} Mar 16 00:21:01 crc kubenswrapper[4816]: I0316 00:21:01.818898 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-psjs7"] Mar 16 00:21:01 crc kubenswrapper[4816]: I0316 00:21:01.821987 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-psjs7"] Mar 16 00:21:03 crc kubenswrapper[4816]: I0316 00:21:03.476230 4816 scope.go:117] "RemoveContainer" 
containerID="1249f8b1c01db98230c10bde49bdb68f17caccc1ba0546e462df4384ca1658dd" Mar 16 00:21:03 crc kubenswrapper[4816]: I0316 00:21:03.492774 4816 scope.go:117] "RemoveContainer" containerID="4d6bf581280690c99ddbbecd4cf4e215a12230d377978aa48acd3c05b85a3c56" Mar 16 00:21:03 crc kubenswrapper[4816]: I0316 00:21:03.511354 4816 scope.go:117] "RemoveContainer" containerID="d0a220f8f08fc88ffdf56d37ec2ba1b59974be62f3a81d988b1462b4794a79a8" Mar 16 00:21:03 crc kubenswrapper[4816]: I0316 00:21:03.541221 4816 scope.go:117] "RemoveContainer" containerID="4e0a510814d60106574e3cd3e612c031ac900caa0f24a12a62d7cd1f6bfda1ed" Mar 16 00:21:03 crc kubenswrapper[4816]: I0316 00:21:03.563284 4816 scope.go:117] "RemoveContainer" containerID="f47b6c54d5d72d827208f4d8228be260d9d19b4a720b3174349288dee2916641" Mar 16 00:21:03 crc kubenswrapper[4816]: I0316 00:21:03.581353 4816 scope.go:117] "RemoveContainer" containerID="0fac0a986edf984b0e34eda0f969205d5bc92f23d0edacb3c6dd8721eaed2fb9" Mar 16 00:21:03 crc kubenswrapper[4816]: I0316 00:21:03.602473 4816 scope.go:117] "RemoveContainer" containerID="aa5f052d32ad8f52f33ee5baea5af98a64f05b831b86ebced6c9422fb4845fcf" Mar 16 00:21:03 crc kubenswrapper[4816]: I0316 00:21:03.619626 4816 scope.go:117] "RemoveContainer" containerID="826495d8ab8f414169bf36159278da0a040fec699a74b4aebdc8f87ccdc48bd6" Mar 16 00:21:03 crc kubenswrapper[4816]: I0316 00:21:03.632076 4816 scope.go:117] "RemoveContainer" containerID="86c28f1ae2f2fb61b7755ac14bc7f5346036735c75daccaa0e31b4e606cfb4c2" Mar 16 00:21:03 crc kubenswrapper[4816]: I0316 00:21:03.657041 4816 scope.go:117] "RemoveContainer" containerID="7e5d87dc1889484bb2175c0613eda0b852c65a289a1c165f6adae2a822892aa2" Mar 16 00:21:03 crc kubenswrapper[4816]: I0316 00:21:03.675662 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88" path="/var/lib/kubelet/pods/2ca6e6b1-6b9c-4bb0-8e08-8201c9c53e88/volumes" Mar 16 00:21:03 crc kubenswrapper[4816]: I0316 
00:21:03.750781 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8h7qh" event={"ID":"842bb112-f402-4717-bc56-f488fc3c5db7","Type":"ContainerStarted","Data":"ab1b71595d8967f55cac92c5aec109d46f21632056549e44c80fd734ab566962"} Mar 16 00:21:03 crc kubenswrapper[4816]: I0316 00:21:03.750825 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8h7qh" event={"ID":"842bb112-f402-4717-bc56-f488fc3c5db7","Type":"ContainerStarted","Data":"a7f8583fc78de60d0291b7007caac080e35b5233fc3c3df2e84a937546c476de"} Mar 16 00:21:03 crc kubenswrapper[4816]: I0316 00:21:03.750840 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8h7qh" event={"ID":"842bb112-f402-4717-bc56-f488fc3c5db7","Type":"ContainerStarted","Data":"eadd6738e6ec7982f52dd38e85081dc5f198ede817985659bad8d0a6b04e7d06"} Mar 16 00:21:03 crc kubenswrapper[4816]: I0316 00:21:03.750866 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8h7qh" event={"ID":"842bb112-f402-4717-bc56-f488fc3c5db7","Type":"ContainerStarted","Data":"ed9254fe0b52cf509ff910fae8230bfbb9e723a664c7eef81c0fb7062492d7a8"} Mar 16 00:21:03 crc kubenswrapper[4816]: I0316 00:21:03.750878 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8h7qh" event={"ID":"842bb112-f402-4717-bc56-f488fc3c5db7","Type":"ContainerStarted","Data":"c6ff289266bc8c827c219dad4dce737231a4d8d8fa08cdd4e0574caa885f065c"} Mar 16 00:21:03 crc kubenswrapper[4816]: I0316 00:21:03.750889 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8h7qh" event={"ID":"842bb112-f402-4717-bc56-f488fc3c5db7","Type":"ContainerStarted","Data":"5ea9560161ac08ce1fb2fa03da7fc937a29a00f0220b6b4f6421faa145d093b5"} Mar 16 00:21:06 crc kubenswrapper[4816]: I0316 00:21:06.773996 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-8h7qh" event={"ID":"842bb112-f402-4717-bc56-f488fc3c5db7","Type":"ContainerStarted","Data":"af5edc529512fefb0161b3b8a090063a6891a4dc66e4e9f4a0046f57b5544748"} Mar 16 00:21:08 crc kubenswrapper[4816]: I0316 00:21:08.791011 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8h7qh" event={"ID":"842bb112-f402-4717-bc56-f488fc3c5db7","Type":"ContainerStarted","Data":"1a51743713484f00c09cc3c0386f957872cc2f2fbe6db30d66224ab6a7bdabd5"} Mar 16 00:21:08 crc kubenswrapper[4816]: I0316 00:21:08.791342 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-8h7qh" Mar 16 00:21:08 crc kubenswrapper[4816]: I0316 00:21:08.791356 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-8h7qh" Mar 16 00:21:08 crc kubenswrapper[4816]: I0316 00:21:08.791364 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-8h7qh" Mar 16 00:21:08 crc kubenswrapper[4816]: I0316 00:21:08.820129 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-8h7qh" Mar 16 00:21:08 crc kubenswrapper[4816]: I0316 00:21:08.821887 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-8h7qh" Mar 16 00:21:08 crc kubenswrapper[4816]: I0316 00:21:08.825943 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-8h7qh" podStartSLOduration=8.825925989 podStartE2EDuration="8.825925989s" podCreationTimestamp="2026-03-16 00:21:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:21:08.822866289 +0000 UTC m=+861.919166292" watchObservedRunningTime="2026-03-16 00:21:08.825925989 +0000 UTC 
m=+861.922225942" Mar 16 00:21:31 crc kubenswrapper[4816]: I0316 00:21:31.149923 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-8h7qh" Mar 16 00:21:31 crc kubenswrapper[4816]: I0316 00:21:31.864013 4816 patch_prober.go:28] interesting pod/machine-config-daemon-jrdcz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 16 00:21:31 crc kubenswrapper[4816]: I0316 00:21:31.864377 4816 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" podUID="dd08ece2-7636-4966-973a-e96a34b70b53" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 16 00:22:00 crc kubenswrapper[4816]: I0316 00:22:00.137832 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29560342-qq7qg"] Mar 16 00:22:00 crc kubenswrapper[4816]: I0316 00:22:00.141651 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29560342-qq7qg" Mar 16 00:22:00 crc kubenswrapper[4816]: I0316 00:22:00.143988 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 16 00:22:00 crc kubenswrapper[4816]: I0316 00:22:00.144151 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-8hc2r" Mar 16 00:22:00 crc kubenswrapper[4816]: I0316 00:22:00.144177 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 16 00:22:00 crc kubenswrapper[4816]: I0316 00:22:00.144573 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29560342-qq7qg"] Mar 16 00:22:00 crc kubenswrapper[4816]: I0316 00:22:00.276629 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vckw\" (UniqueName: \"kubernetes.io/projected/d60f1a00-e9c6-46ff-b5eb-f3c680f04736-kube-api-access-6vckw\") pod \"auto-csr-approver-29560342-qq7qg\" (UID: \"d60f1a00-e9c6-46ff-b5eb-f3c680f04736\") " pod="openshift-infra/auto-csr-approver-29560342-qq7qg" Mar 16 00:22:00 crc kubenswrapper[4816]: I0316 00:22:00.378440 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vckw\" (UniqueName: \"kubernetes.io/projected/d60f1a00-e9c6-46ff-b5eb-f3c680f04736-kube-api-access-6vckw\") pod \"auto-csr-approver-29560342-qq7qg\" (UID: \"d60f1a00-e9c6-46ff-b5eb-f3c680f04736\") " pod="openshift-infra/auto-csr-approver-29560342-qq7qg" Mar 16 00:22:00 crc kubenswrapper[4816]: I0316 00:22:00.400391 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vckw\" (UniqueName: \"kubernetes.io/projected/d60f1a00-e9c6-46ff-b5eb-f3c680f04736-kube-api-access-6vckw\") pod \"auto-csr-approver-29560342-qq7qg\" (UID: \"d60f1a00-e9c6-46ff-b5eb-f3c680f04736\") " 
pod="openshift-infra/auto-csr-approver-29560342-qq7qg" Mar 16 00:22:00 crc kubenswrapper[4816]: I0316 00:22:00.461203 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560342-qq7qg" Mar 16 00:22:01 crc kubenswrapper[4816]: I0316 00:22:00.662902 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29560342-qq7qg"] Mar 16 00:22:01 crc kubenswrapper[4816]: I0316 00:22:00.669102 4816 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 16 00:22:01 crc kubenswrapper[4816]: I0316 00:22:01.093174 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560342-qq7qg" event={"ID":"d60f1a00-e9c6-46ff-b5eb-f3c680f04736","Type":"ContainerStarted","Data":"4f4e0cce66b2e8f404303d0d5f05b8d4e1ec1593cfc9384ac6f5c65a0d46c71e"} Mar 16 00:22:01 crc kubenswrapper[4816]: I0316 00:22:01.863149 4816 patch_prober.go:28] interesting pod/machine-config-daemon-jrdcz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 16 00:22:01 crc kubenswrapper[4816]: I0316 00:22:01.863222 4816 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" podUID="dd08ece2-7636-4966-973a-e96a34b70b53" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 16 00:22:02 crc kubenswrapper[4816]: I0316 00:22:02.100167 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560342-qq7qg" event={"ID":"d60f1a00-e9c6-46ff-b5eb-f3c680f04736","Type":"ContainerStarted","Data":"dbd7c0bfa602e132787d7d6d843e255ebdb6acf34354466437ff4e5db80a17a7"} Mar 16 00:22:02 
crc kubenswrapper[4816]: I0316 00:22:02.120431 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29560342-qq7qg" podStartSLOduration=1.047352211 podStartE2EDuration="2.120411942s" podCreationTimestamp="2026-03-16 00:22:00 +0000 UTC" firstStartedPulling="2026-03-16 00:22:00.668921977 +0000 UTC m=+913.765221930" lastFinishedPulling="2026-03-16 00:22:01.741981698 +0000 UTC m=+914.838281661" observedRunningTime="2026-03-16 00:22:02.118174407 +0000 UTC m=+915.214474370" watchObservedRunningTime="2026-03-16 00:22:02.120411942 +0000 UTC m=+915.216711895" Mar 16 00:22:03 crc kubenswrapper[4816]: I0316 00:22:03.109771 4816 generic.go:334] "Generic (PLEG): container finished" podID="d60f1a00-e9c6-46ff-b5eb-f3c680f04736" containerID="dbd7c0bfa602e132787d7d6d843e255ebdb6acf34354466437ff4e5db80a17a7" exitCode=0 Mar 16 00:22:03 crc kubenswrapper[4816]: I0316 00:22:03.109853 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560342-qq7qg" event={"ID":"d60f1a00-e9c6-46ff-b5eb-f3c680f04736","Type":"ContainerDied","Data":"dbd7c0bfa602e132787d7d6d843e255ebdb6acf34354466437ff4e5db80a17a7"} Mar 16 00:22:04 crc kubenswrapper[4816]: I0316 00:22:04.351523 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29560342-qq7qg" Mar 16 00:22:04 crc kubenswrapper[4816]: I0316 00:22:04.432267 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6vckw\" (UniqueName: \"kubernetes.io/projected/d60f1a00-e9c6-46ff-b5eb-f3c680f04736-kube-api-access-6vckw\") pod \"d60f1a00-e9c6-46ff-b5eb-f3c680f04736\" (UID: \"d60f1a00-e9c6-46ff-b5eb-f3c680f04736\") " Mar 16 00:22:04 crc kubenswrapper[4816]: I0316 00:22:04.442350 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d60f1a00-e9c6-46ff-b5eb-f3c680f04736-kube-api-access-6vckw" (OuterVolumeSpecName: "kube-api-access-6vckw") pod "d60f1a00-e9c6-46ff-b5eb-f3c680f04736" (UID: "d60f1a00-e9c6-46ff-b5eb-f3c680f04736"). InnerVolumeSpecName "kube-api-access-6vckw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:22:04 crc kubenswrapper[4816]: I0316 00:22:04.459036 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-nvgvc"] Mar 16 00:22:04 crc kubenswrapper[4816]: I0316 00:22:04.459291 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-nvgvc" podUID="63fdf2e4-cd4d-40ad-ae9e-60642f5f47fa" containerName="registry-server" containerID="cri-o://5794c299dc8a2535c1228578a35665f886997b739f9467f169232f1cb504746a" gracePeriod=30 Mar 16 00:22:04 crc kubenswrapper[4816]: I0316 00:22:04.534104 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6vckw\" (UniqueName: \"kubernetes.io/projected/d60f1a00-e9c6-46ff-b5eb-f3c680f04736-kube-api-access-6vckw\") on node \"crc\" DevicePath \"\"" Mar 16 00:22:04 crc kubenswrapper[4816]: I0316 00:22:04.760220 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nvgvc" Mar 16 00:22:04 crc kubenswrapper[4816]: I0316 00:22:04.836986 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63fdf2e4-cd4d-40ad-ae9e-60642f5f47fa-catalog-content\") pod \"63fdf2e4-cd4d-40ad-ae9e-60642f5f47fa\" (UID: \"63fdf2e4-cd4d-40ad-ae9e-60642f5f47fa\") " Mar 16 00:22:04 crc kubenswrapper[4816]: I0316 00:22:04.837032 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63fdf2e4-cd4d-40ad-ae9e-60642f5f47fa-utilities\") pod \"63fdf2e4-cd4d-40ad-ae9e-60642f5f47fa\" (UID: \"63fdf2e4-cd4d-40ad-ae9e-60642f5f47fa\") " Mar 16 00:22:04 crc kubenswrapper[4816]: I0316 00:22:04.837055 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qh6xk\" (UniqueName: \"kubernetes.io/projected/63fdf2e4-cd4d-40ad-ae9e-60642f5f47fa-kube-api-access-qh6xk\") pod \"63fdf2e4-cd4d-40ad-ae9e-60642f5f47fa\" (UID: \"63fdf2e4-cd4d-40ad-ae9e-60642f5f47fa\") " Mar 16 00:22:04 crc kubenswrapper[4816]: I0316 00:22:04.838900 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/63fdf2e4-cd4d-40ad-ae9e-60642f5f47fa-utilities" (OuterVolumeSpecName: "utilities") pod "63fdf2e4-cd4d-40ad-ae9e-60642f5f47fa" (UID: "63fdf2e4-cd4d-40ad-ae9e-60642f5f47fa"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:22:04 crc kubenswrapper[4816]: I0316 00:22:04.841431 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63fdf2e4-cd4d-40ad-ae9e-60642f5f47fa-kube-api-access-qh6xk" (OuterVolumeSpecName: "kube-api-access-qh6xk") pod "63fdf2e4-cd4d-40ad-ae9e-60642f5f47fa" (UID: "63fdf2e4-cd4d-40ad-ae9e-60642f5f47fa"). InnerVolumeSpecName "kube-api-access-qh6xk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:22:04 crc kubenswrapper[4816]: I0316 00:22:04.863490 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/63fdf2e4-cd4d-40ad-ae9e-60642f5f47fa-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "63fdf2e4-cd4d-40ad-ae9e-60642f5f47fa" (UID: "63fdf2e4-cd4d-40ad-ae9e-60642f5f47fa"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:22:04 crc kubenswrapper[4816]: I0316 00:22:04.938963 4816 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63fdf2e4-cd4d-40ad-ae9e-60642f5f47fa-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 16 00:22:04 crc kubenswrapper[4816]: I0316 00:22:04.939004 4816 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63fdf2e4-cd4d-40ad-ae9e-60642f5f47fa-utilities\") on node \"crc\" DevicePath \"\"" Mar 16 00:22:04 crc kubenswrapper[4816]: I0316 00:22:04.939017 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qh6xk\" (UniqueName: \"kubernetes.io/projected/63fdf2e4-cd4d-40ad-ae9e-60642f5f47fa-kube-api-access-qh6xk\") on node \"crc\" DevicePath \"\"" Mar 16 00:22:05 crc kubenswrapper[4816]: I0316 00:22:05.122618 4816 generic.go:334] "Generic (PLEG): container finished" podID="63fdf2e4-cd4d-40ad-ae9e-60642f5f47fa" containerID="5794c299dc8a2535c1228578a35665f886997b739f9467f169232f1cb504746a" exitCode=0 Mar 16 00:22:05 crc kubenswrapper[4816]: I0316 00:22:05.122668 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nvgvc" Mar 16 00:22:05 crc kubenswrapper[4816]: I0316 00:22:05.122692 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nvgvc" event={"ID":"63fdf2e4-cd4d-40ad-ae9e-60642f5f47fa","Type":"ContainerDied","Data":"5794c299dc8a2535c1228578a35665f886997b739f9467f169232f1cb504746a"} Mar 16 00:22:05 crc kubenswrapper[4816]: I0316 00:22:05.122737 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nvgvc" event={"ID":"63fdf2e4-cd4d-40ad-ae9e-60642f5f47fa","Type":"ContainerDied","Data":"3956bdc0939ca6c80a18b82143c55a4cfebb9af362a0d61193b0fe36b4f051bd"} Mar 16 00:22:05 crc kubenswrapper[4816]: I0316 00:22:05.122759 4816 scope.go:117] "RemoveContainer" containerID="5794c299dc8a2535c1228578a35665f886997b739f9467f169232f1cb504746a" Mar 16 00:22:05 crc kubenswrapper[4816]: I0316 00:22:05.123996 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560342-qq7qg" event={"ID":"d60f1a00-e9c6-46ff-b5eb-f3c680f04736","Type":"ContainerDied","Data":"4f4e0cce66b2e8f404303d0d5f05b8d4e1ec1593cfc9384ac6f5c65a0d46c71e"} Mar 16 00:22:05 crc kubenswrapper[4816]: I0316 00:22:05.124027 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4f4e0cce66b2e8f404303d0d5f05b8d4e1ec1593cfc9384ac6f5c65a0d46c71e" Mar 16 00:22:05 crc kubenswrapper[4816]: I0316 00:22:05.124065 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29560342-qq7qg" Mar 16 00:22:05 crc kubenswrapper[4816]: I0316 00:22:05.147088 4816 scope.go:117] "RemoveContainer" containerID="6af5777618386b35653fedfb44583f6c085ee0d0796e2336a254ffa7ee599f64" Mar 16 00:22:05 crc kubenswrapper[4816]: I0316 00:22:05.175193 4816 scope.go:117] "RemoveContainer" containerID="4810564b12a91bcd6219665353971cf8fa3c739485d9b178416c0e173435b096" Mar 16 00:22:05 crc kubenswrapper[4816]: I0316 00:22:05.180653 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-nvgvc"] Mar 16 00:22:05 crc kubenswrapper[4816]: I0316 00:22:05.184289 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-nvgvc"] Mar 16 00:22:05 crc kubenswrapper[4816]: I0316 00:22:05.189221 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29560336-fncq8"] Mar 16 00:22:05 crc kubenswrapper[4816]: I0316 00:22:05.190971 4816 scope.go:117] "RemoveContainer" containerID="5794c299dc8a2535c1228578a35665f886997b739f9467f169232f1cb504746a" Mar 16 00:22:05 crc kubenswrapper[4816]: E0316 00:22:05.191413 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5794c299dc8a2535c1228578a35665f886997b739f9467f169232f1cb504746a\": container with ID starting with 5794c299dc8a2535c1228578a35665f886997b739f9467f169232f1cb504746a not found: ID does not exist" containerID="5794c299dc8a2535c1228578a35665f886997b739f9467f169232f1cb504746a" Mar 16 00:22:05 crc kubenswrapper[4816]: I0316 00:22:05.191469 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5794c299dc8a2535c1228578a35665f886997b739f9467f169232f1cb504746a"} err="failed to get container status \"5794c299dc8a2535c1228578a35665f886997b739f9467f169232f1cb504746a\": rpc error: code = NotFound desc = could not find container 
\"5794c299dc8a2535c1228578a35665f886997b739f9467f169232f1cb504746a\": container with ID starting with 5794c299dc8a2535c1228578a35665f886997b739f9467f169232f1cb504746a not found: ID does not exist" Mar 16 00:22:05 crc kubenswrapper[4816]: I0316 00:22:05.191515 4816 scope.go:117] "RemoveContainer" containerID="6af5777618386b35653fedfb44583f6c085ee0d0796e2336a254ffa7ee599f64" Mar 16 00:22:05 crc kubenswrapper[4816]: E0316 00:22:05.191875 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6af5777618386b35653fedfb44583f6c085ee0d0796e2336a254ffa7ee599f64\": container with ID starting with 6af5777618386b35653fedfb44583f6c085ee0d0796e2336a254ffa7ee599f64 not found: ID does not exist" containerID="6af5777618386b35653fedfb44583f6c085ee0d0796e2336a254ffa7ee599f64" Mar 16 00:22:05 crc kubenswrapper[4816]: I0316 00:22:05.192029 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6af5777618386b35653fedfb44583f6c085ee0d0796e2336a254ffa7ee599f64"} err="failed to get container status \"6af5777618386b35653fedfb44583f6c085ee0d0796e2336a254ffa7ee599f64\": rpc error: code = NotFound desc = could not find container \"6af5777618386b35653fedfb44583f6c085ee0d0796e2336a254ffa7ee599f64\": container with ID starting with 6af5777618386b35653fedfb44583f6c085ee0d0796e2336a254ffa7ee599f64 not found: ID does not exist" Mar 16 00:22:05 crc kubenswrapper[4816]: I0316 00:22:05.192096 4816 scope.go:117] "RemoveContainer" containerID="4810564b12a91bcd6219665353971cf8fa3c739485d9b178416c0e173435b096" Mar 16 00:22:05 crc kubenswrapper[4816]: E0316 00:22:05.192584 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4810564b12a91bcd6219665353971cf8fa3c739485d9b178416c0e173435b096\": container with ID starting with 4810564b12a91bcd6219665353971cf8fa3c739485d9b178416c0e173435b096 not found: ID does not exist" 
containerID="4810564b12a91bcd6219665353971cf8fa3c739485d9b178416c0e173435b096" Mar 16 00:22:05 crc kubenswrapper[4816]: I0316 00:22:05.192615 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4810564b12a91bcd6219665353971cf8fa3c739485d9b178416c0e173435b096"} err="failed to get container status \"4810564b12a91bcd6219665353971cf8fa3c739485d9b178416c0e173435b096\": rpc error: code = NotFound desc = could not find container \"4810564b12a91bcd6219665353971cf8fa3c739485d9b178416c0e173435b096\": container with ID starting with 4810564b12a91bcd6219665353971cf8fa3c739485d9b178416c0e173435b096 not found: ID does not exist" Mar 16 00:22:05 crc kubenswrapper[4816]: I0316 00:22:05.193723 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29560336-fncq8"] Mar 16 00:22:05 crc kubenswrapper[4816]: I0316 00:22:05.673700 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63fdf2e4-cd4d-40ad-ae9e-60642f5f47fa" path="/var/lib/kubelet/pods/63fdf2e4-cd4d-40ad-ae9e-60642f5f47fa/volumes" Mar 16 00:22:05 crc kubenswrapper[4816]: I0316 00:22:05.674777 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b478a542-14c7-4cca-9f95-64766b34df27" path="/var/lib/kubelet/pods/b478a542-14c7-4cca-9f95-64766b34df27/volumes" Mar 16 00:22:08 crc kubenswrapper[4816]: I0316 00:22:08.431599 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085xqb4"] Mar 16 00:22:08 crc kubenswrapper[4816]: E0316 00:22:08.432111 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63fdf2e4-cd4d-40ad-ae9e-60642f5f47fa" containerName="extract-content" Mar 16 00:22:08 crc kubenswrapper[4816]: I0316 00:22:08.432123 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="63fdf2e4-cd4d-40ad-ae9e-60642f5f47fa" containerName="extract-content" Mar 16 00:22:08 crc kubenswrapper[4816]: E0316 
00:22:08.432131 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63fdf2e4-cd4d-40ad-ae9e-60642f5f47fa" containerName="registry-server" Mar 16 00:22:08 crc kubenswrapper[4816]: I0316 00:22:08.432137 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="63fdf2e4-cd4d-40ad-ae9e-60642f5f47fa" containerName="registry-server" Mar 16 00:22:08 crc kubenswrapper[4816]: E0316 00:22:08.432146 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63fdf2e4-cd4d-40ad-ae9e-60642f5f47fa" containerName="extract-utilities" Mar 16 00:22:08 crc kubenswrapper[4816]: I0316 00:22:08.432152 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="63fdf2e4-cd4d-40ad-ae9e-60642f5f47fa" containerName="extract-utilities" Mar 16 00:22:08 crc kubenswrapper[4816]: E0316 00:22:08.432160 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d60f1a00-e9c6-46ff-b5eb-f3c680f04736" containerName="oc" Mar 16 00:22:08 crc kubenswrapper[4816]: I0316 00:22:08.432166 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="d60f1a00-e9c6-46ff-b5eb-f3c680f04736" containerName="oc" Mar 16 00:22:08 crc kubenswrapper[4816]: I0316 00:22:08.432245 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="63fdf2e4-cd4d-40ad-ae9e-60642f5f47fa" containerName="registry-server" Mar 16 00:22:08 crc kubenswrapper[4816]: I0316 00:22:08.432256 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="d60f1a00-e9c6-46ff-b5eb-f3c680f04736" containerName="oc" Mar 16 00:22:08 crc kubenswrapper[4816]: I0316 00:22:08.432968 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085xqb4" Mar 16 00:22:08 crc kubenswrapper[4816]: I0316 00:22:08.436175 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 16 00:22:08 crc kubenswrapper[4816]: I0316 00:22:08.445252 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085xqb4"] Mar 16 00:22:08 crc kubenswrapper[4816]: I0316 00:22:08.582776 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/35d36436-ca87-48ef-9a68-484c2335bb33-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085xqb4\" (UID: \"35d36436-ca87-48ef-9a68-484c2335bb33\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085xqb4" Mar 16 00:22:08 crc kubenswrapper[4816]: I0316 00:22:08.582842 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wqn6m\" (UniqueName: \"kubernetes.io/projected/35d36436-ca87-48ef-9a68-484c2335bb33-kube-api-access-wqn6m\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085xqb4\" (UID: \"35d36436-ca87-48ef-9a68-484c2335bb33\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085xqb4" Mar 16 00:22:08 crc kubenswrapper[4816]: I0316 00:22:08.582939 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/35d36436-ca87-48ef-9a68-484c2335bb33-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085xqb4\" (UID: \"35d36436-ca87-48ef-9a68-484c2335bb33\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085xqb4" Mar 16 00:22:08 crc kubenswrapper[4816]: 
I0316 00:22:08.683700 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wqn6m\" (UniqueName: \"kubernetes.io/projected/35d36436-ca87-48ef-9a68-484c2335bb33-kube-api-access-wqn6m\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085xqb4\" (UID: \"35d36436-ca87-48ef-9a68-484c2335bb33\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085xqb4" Mar 16 00:22:08 crc kubenswrapper[4816]: I0316 00:22:08.683803 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/35d36436-ca87-48ef-9a68-484c2335bb33-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085xqb4\" (UID: \"35d36436-ca87-48ef-9a68-484c2335bb33\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085xqb4" Mar 16 00:22:08 crc kubenswrapper[4816]: I0316 00:22:08.683862 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/35d36436-ca87-48ef-9a68-484c2335bb33-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085xqb4\" (UID: \"35d36436-ca87-48ef-9a68-484c2335bb33\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085xqb4" Mar 16 00:22:08 crc kubenswrapper[4816]: I0316 00:22:08.684369 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/35d36436-ca87-48ef-9a68-484c2335bb33-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085xqb4\" (UID: \"35d36436-ca87-48ef-9a68-484c2335bb33\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085xqb4" Mar 16 00:22:08 crc kubenswrapper[4816]: I0316 00:22:08.684524 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/35d36436-ca87-48ef-9a68-484c2335bb33-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085xqb4\" (UID: \"35d36436-ca87-48ef-9a68-484c2335bb33\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085xqb4" Mar 16 00:22:08 crc kubenswrapper[4816]: I0316 00:22:08.706606 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wqn6m\" (UniqueName: \"kubernetes.io/projected/35d36436-ca87-48ef-9a68-484c2335bb33-kube-api-access-wqn6m\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085xqb4\" (UID: \"35d36436-ca87-48ef-9a68-484c2335bb33\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085xqb4" Mar 16 00:22:08 crc kubenswrapper[4816]: I0316 00:22:08.747517 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085xqb4" Mar 16 00:22:08 crc kubenswrapper[4816]: I0316 00:22:08.943299 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085xqb4"] Mar 16 00:22:09 crc kubenswrapper[4816]: I0316 00:22:09.154533 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085xqb4" event={"ID":"35d36436-ca87-48ef-9a68-484c2335bb33","Type":"ContainerStarted","Data":"20fa11f8bf7bd536e4ee598e1c06d0e4f08ba15ea4cd82786ae11f1cff7ad5d5"} Mar 16 00:22:09 crc kubenswrapper[4816]: I0316 00:22:09.154633 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085xqb4" event={"ID":"35d36436-ca87-48ef-9a68-484c2335bb33","Type":"ContainerStarted","Data":"b2e666dffac4d36453ae492f3bd8ee6e9e91e9b6a649968caa1dcade8b909136"} Mar 16 00:22:10 crc kubenswrapper[4816]: I0316 00:22:10.161116 4816 
generic.go:334] "Generic (PLEG): container finished" podID="35d36436-ca87-48ef-9a68-484c2335bb33" containerID="20fa11f8bf7bd536e4ee598e1c06d0e4f08ba15ea4cd82786ae11f1cff7ad5d5" exitCode=0 Mar 16 00:22:10 crc kubenswrapper[4816]: I0316 00:22:10.161193 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085xqb4" event={"ID":"35d36436-ca87-48ef-9a68-484c2335bb33","Type":"ContainerDied","Data":"20fa11f8bf7bd536e4ee598e1c06d0e4f08ba15ea4cd82786ae11f1cff7ad5d5"} Mar 16 00:22:11 crc kubenswrapper[4816]: I0316 00:22:11.381063 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-284nb"] Mar 16 00:22:11 crc kubenswrapper[4816]: I0316 00:22:11.382876 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-284nb" Mar 16 00:22:11 crc kubenswrapper[4816]: I0316 00:22:11.389012 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-284nb"] Mar 16 00:22:11 crc kubenswrapper[4816]: I0316 00:22:11.520434 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/449b2c21-4396-4d46-af73-e670b282f831-catalog-content\") pod \"redhat-operators-284nb\" (UID: \"449b2c21-4396-4d46-af73-e670b282f831\") " pod="openshift-marketplace/redhat-operators-284nb" Mar 16 00:22:11 crc kubenswrapper[4816]: I0316 00:22:11.520513 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/449b2c21-4396-4d46-af73-e670b282f831-utilities\") pod \"redhat-operators-284nb\" (UID: \"449b2c21-4396-4d46-af73-e670b282f831\") " pod="openshift-marketplace/redhat-operators-284nb" Mar 16 00:22:11 crc kubenswrapper[4816]: I0316 00:22:11.520612 4816 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44s4s\" (UniqueName: \"kubernetes.io/projected/449b2c21-4396-4d46-af73-e670b282f831-kube-api-access-44s4s\") pod \"redhat-operators-284nb\" (UID: \"449b2c21-4396-4d46-af73-e670b282f831\") " pod="openshift-marketplace/redhat-operators-284nb" Mar 16 00:22:11 crc kubenswrapper[4816]: I0316 00:22:11.622242 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44s4s\" (UniqueName: \"kubernetes.io/projected/449b2c21-4396-4d46-af73-e670b282f831-kube-api-access-44s4s\") pod \"redhat-operators-284nb\" (UID: \"449b2c21-4396-4d46-af73-e670b282f831\") " pod="openshift-marketplace/redhat-operators-284nb" Mar 16 00:22:11 crc kubenswrapper[4816]: I0316 00:22:11.622838 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/449b2c21-4396-4d46-af73-e670b282f831-catalog-content\") pod \"redhat-operators-284nb\" (UID: \"449b2c21-4396-4d46-af73-e670b282f831\") " pod="openshift-marketplace/redhat-operators-284nb" Mar 16 00:22:11 crc kubenswrapper[4816]: I0316 00:22:11.622895 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/449b2c21-4396-4d46-af73-e670b282f831-utilities\") pod \"redhat-operators-284nb\" (UID: \"449b2c21-4396-4d46-af73-e670b282f831\") " pod="openshift-marketplace/redhat-operators-284nb" Mar 16 00:22:11 crc kubenswrapper[4816]: I0316 00:22:11.623381 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/449b2c21-4396-4d46-af73-e670b282f831-utilities\") pod \"redhat-operators-284nb\" (UID: \"449b2c21-4396-4d46-af73-e670b282f831\") " pod="openshift-marketplace/redhat-operators-284nb" Mar 16 00:22:11 crc kubenswrapper[4816]: I0316 00:22:11.623515 4816 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/449b2c21-4396-4d46-af73-e670b282f831-catalog-content\") pod \"redhat-operators-284nb\" (UID: \"449b2c21-4396-4d46-af73-e670b282f831\") " pod="openshift-marketplace/redhat-operators-284nb" Mar 16 00:22:11 crc kubenswrapper[4816]: I0316 00:22:11.643203 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-44s4s\" (UniqueName: \"kubernetes.io/projected/449b2c21-4396-4d46-af73-e670b282f831-kube-api-access-44s4s\") pod \"redhat-operators-284nb\" (UID: \"449b2c21-4396-4d46-af73-e670b282f831\") " pod="openshift-marketplace/redhat-operators-284nb" Mar 16 00:22:11 crc kubenswrapper[4816]: I0316 00:22:11.723089 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-284nb" Mar 16 00:22:11 crc kubenswrapper[4816]: I0316 00:22:11.917034 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-284nb"] Mar 16 00:22:12 crc kubenswrapper[4816]: I0316 00:22:12.173511 4816 generic.go:334] "Generic (PLEG): container finished" podID="449b2c21-4396-4d46-af73-e670b282f831" containerID="1891eef8092d3a2994e7d2d76188c04ec2abc90123f32af569269e31014fdb64" exitCode=0 Mar 16 00:22:12 crc kubenswrapper[4816]: I0316 00:22:12.173704 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-284nb" event={"ID":"449b2c21-4396-4d46-af73-e670b282f831","Type":"ContainerDied","Data":"1891eef8092d3a2994e7d2d76188c04ec2abc90123f32af569269e31014fdb64"} Mar 16 00:22:12 crc kubenswrapper[4816]: I0316 00:22:12.173904 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-284nb" event={"ID":"449b2c21-4396-4d46-af73-e670b282f831","Type":"ContainerStarted","Data":"d18b65047ef76f8aa8382e87295979d6bc4c30a17f5c2e1a21242e76f4c64c95"} Mar 16 00:22:12 crc kubenswrapper[4816]: I0316 00:22:12.175601 4816 generic.go:334] 
"Generic (PLEG): container finished" podID="35d36436-ca87-48ef-9a68-484c2335bb33" containerID="666c84ff5793be20b033df37aa373da029267c23d1648b80c2fb628543806f79" exitCode=0 Mar 16 00:22:12 crc kubenswrapper[4816]: I0316 00:22:12.175637 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085xqb4" event={"ID":"35d36436-ca87-48ef-9a68-484c2335bb33","Type":"ContainerDied","Data":"666c84ff5793be20b033df37aa373da029267c23d1648b80c2fb628543806f79"} Mar 16 00:22:13 crc kubenswrapper[4816]: I0316 00:22:13.183794 4816 generic.go:334] "Generic (PLEG): container finished" podID="35d36436-ca87-48ef-9a68-484c2335bb33" containerID="7656017a9f30cdee825f1f174f1a6b26741bc9f989f7fb768e5f61a6979f53e4" exitCode=0 Mar 16 00:22:13 crc kubenswrapper[4816]: I0316 00:22:13.183847 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085xqb4" event={"ID":"35d36436-ca87-48ef-9a68-484c2335bb33","Type":"ContainerDied","Data":"7656017a9f30cdee825f1f174f1a6b26741bc9f989f7fb768e5f61a6979f53e4"} Mar 16 00:22:14 crc kubenswrapper[4816]: I0316 00:22:14.193609 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-284nb" event={"ID":"449b2c21-4396-4d46-af73-e670b282f831","Type":"ContainerStarted","Data":"6675088a9bc77bc6e858928c42b983a7c8adb51e5f6f5dc372c160b6c32fce59"} Mar 16 00:22:14 crc kubenswrapper[4816]: I0316 00:22:14.463065 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085xqb4" Mar 16 00:22:14 crc kubenswrapper[4816]: I0316 00:22:14.635507 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/35d36436-ca87-48ef-9a68-484c2335bb33-util\") pod \"35d36436-ca87-48ef-9a68-484c2335bb33\" (UID: \"35d36436-ca87-48ef-9a68-484c2335bb33\") " Mar 16 00:22:14 crc kubenswrapper[4816]: I0316 00:22:14.635722 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wqn6m\" (UniqueName: \"kubernetes.io/projected/35d36436-ca87-48ef-9a68-484c2335bb33-kube-api-access-wqn6m\") pod \"35d36436-ca87-48ef-9a68-484c2335bb33\" (UID: \"35d36436-ca87-48ef-9a68-484c2335bb33\") " Mar 16 00:22:14 crc kubenswrapper[4816]: I0316 00:22:14.635801 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/35d36436-ca87-48ef-9a68-484c2335bb33-bundle\") pod \"35d36436-ca87-48ef-9a68-484c2335bb33\" (UID: \"35d36436-ca87-48ef-9a68-484c2335bb33\") " Mar 16 00:22:14 crc kubenswrapper[4816]: I0316 00:22:14.639032 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/35d36436-ca87-48ef-9a68-484c2335bb33-bundle" (OuterVolumeSpecName: "bundle") pod "35d36436-ca87-48ef-9a68-484c2335bb33" (UID: "35d36436-ca87-48ef-9a68-484c2335bb33"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:22:14 crc kubenswrapper[4816]: I0316 00:22:14.643714 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35d36436-ca87-48ef-9a68-484c2335bb33-kube-api-access-wqn6m" (OuterVolumeSpecName: "kube-api-access-wqn6m") pod "35d36436-ca87-48ef-9a68-484c2335bb33" (UID: "35d36436-ca87-48ef-9a68-484c2335bb33"). InnerVolumeSpecName "kube-api-access-wqn6m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:22:14 crc kubenswrapper[4816]: I0316 00:22:14.657292 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/35d36436-ca87-48ef-9a68-484c2335bb33-util" (OuterVolumeSpecName: "util") pod "35d36436-ca87-48ef-9a68-484c2335bb33" (UID: "35d36436-ca87-48ef-9a68-484c2335bb33"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:22:14 crc kubenswrapper[4816]: I0316 00:22:14.737221 4816 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/35d36436-ca87-48ef-9a68-484c2335bb33-bundle\") on node \"crc\" DevicePath \"\"" Mar 16 00:22:14 crc kubenswrapper[4816]: I0316 00:22:14.737250 4816 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/35d36436-ca87-48ef-9a68-484c2335bb33-util\") on node \"crc\" DevicePath \"\"" Mar 16 00:22:14 crc kubenswrapper[4816]: I0316 00:22:14.737259 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wqn6m\" (UniqueName: \"kubernetes.io/projected/35d36436-ca87-48ef-9a68-484c2335bb33-kube-api-access-wqn6m\") on node \"crc\" DevicePath \"\"" Mar 16 00:22:15 crc kubenswrapper[4816]: I0316 00:22:15.202921 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085xqb4" event={"ID":"35d36436-ca87-48ef-9a68-484c2335bb33","Type":"ContainerDied","Data":"b2e666dffac4d36453ae492f3bd8ee6e9e91e9b6a649968caa1dcade8b909136"} Mar 16 00:22:15 crc kubenswrapper[4816]: I0316 00:22:15.203232 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b2e666dffac4d36453ae492f3bd8ee6e9e91e9b6a649968caa1dcade8b909136" Mar 16 00:22:15 crc kubenswrapper[4816]: I0316 00:22:15.203006 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085xqb4" Mar 16 00:22:15 crc kubenswrapper[4816]: I0316 00:22:15.207184 4816 generic.go:334] "Generic (PLEG): container finished" podID="449b2c21-4396-4d46-af73-e670b282f831" containerID="6675088a9bc77bc6e858928c42b983a7c8adb51e5f6f5dc372c160b6c32fce59" exitCode=0 Mar 16 00:22:15 crc kubenswrapper[4816]: I0316 00:22:15.207233 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-284nb" event={"ID":"449b2c21-4396-4d46-af73-e670b282f831","Type":"ContainerDied","Data":"6675088a9bc77bc6e858928c42b983a7c8adb51e5f6f5dc372c160b6c32fce59"} Mar 16 00:22:15 crc kubenswrapper[4816]: I0316 00:22:15.219352 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39exnc6z"] Mar 16 00:22:15 crc kubenswrapper[4816]: E0316 00:22:15.219707 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35d36436-ca87-48ef-9a68-484c2335bb33" containerName="util" Mar 16 00:22:15 crc kubenswrapper[4816]: I0316 00:22:15.219735 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="35d36436-ca87-48ef-9a68-484c2335bb33" containerName="util" Mar 16 00:22:15 crc kubenswrapper[4816]: E0316 00:22:15.219758 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35d36436-ca87-48ef-9a68-484c2335bb33" containerName="extract" Mar 16 00:22:15 crc kubenswrapper[4816]: I0316 00:22:15.219767 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="35d36436-ca87-48ef-9a68-484c2335bb33" containerName="extract" Mar 16 00:22:15 crc kubenswrapper[4816]: E0316 00:22:15.219780 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35d36436-ca87-48ef-9a68-484c2335bb33" containerName="pull" Mar 16 00:22:15 crc kubenswrapper[4816]: I0316 00:22:15.219788 4816 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="35d36436-ca87-48ef-9a68-484c2335bb33" containerName="pull" Mar 16 00:22:15 crc kubenswrapper[4816]: I0316 00:22:15.219914 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="35d36436-ca87-48ef-9a68-484c2335bb33" containerName="extract" Mar 16 00:22:15 crc kubenswrapper[4816]: I0316 00:22:15.221066 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39exnc6z" Mar 16 00:22:15 crc kubenswrapper[4816]: I0316 00:22:15.223012 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 16 00:22:15 crc kubenswrapper[4816]: I0316 00:22:15.229421 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39exnc6z"] Mar 16 00:22:15 crc kubenswrapper[4816]: I0316 00:22:15.343531 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3e8bb6e1-f8fd-4484-ba21-a2d5f80f0d1c-bundle\") pod \"7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39exnc6z\" (UID: \"3e8bb6e1-f8fd-4484-ba21-a2d5f80f0d1c\") " pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39exnc6z" Mar 16 00:22:15 crc kubenswrapper[4816]: I0316 00:22:15.343882 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2nxj\" (UniqueName: \"kubernetes.io/projected/3e8bb6e1-f8fd-4484-ba21-a2d5f80f0d1c-kube-api-access-v2nxj\") pod \"7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39exnc6z\" (UID: \"3e8bb6e1-f8fd-4484-ba21-a2d5f80f0d1c\") " pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39exnc6z" Mar 16 00:22:15 crc kubenswrapper[4816]: I0316 00:22:15.343920 4816 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3e8bb6e1-f8fd-4484-ba21-a2d5f80f0d1c-util\") pod \"7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39exnc6z\" (UID: \"3e8bb6e1-f8fd-4484-ba21-a2d5f80f0d1c\") " pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39exnc6z" Mar 16 00:22:15 crc kubenswrapper[4816]: I0316 00:22:15.444624 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v2nxj\" (UniqueName: \"kubernetes.io/projected/3e8bb6e1-f8fd-4484-ba21-a2d5f80f0d1c-kube-api-access-v2nxj\") pod \"7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39exnc6z\" (UID: \"3e8bb6e1-f8fd-4484-ba21-a2d5f80f0d1c\") " pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39exnc6z" Mar 16 00:22:15 crc kubenswrapper[4816]: I0316 00:22:15.444826 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3e8bb6e1-f8fd-4484-ba21-a2d5f80f0d1c-util\") pod \"7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39exnc6z\" (UID: \"3e8bb6e1-f8fd-4484-ba21-a2d5f80f0d1c\") " pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39exnc6z" Mar 16 00:22:15 crc kubenswrapper[4816]: I0316 00:22:15.444903 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3e8bb6e1-f8fd-4484-ba21-a2d5f80f0d1c-bundle\") pod \"7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39exnc6z\" (UID: \"3e8bb6e1-f8fd-4484-ba21-a2d5f80f0d1c\") " pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39exnc6z" Mar 16 00:22:15 crc kubenswrapper[4816]: I0316 00:22:15.445610 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3e8bb6e1-f8fd-4484-ba21-a2d5f80f0d1c-bundle\") 
pod \"7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39exnc6z\" (UID: \"3e8bb6e1-f8fd-4484-ba21-a2d5f80f0d1c\") " pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39exnc6z" Mar 16 00:22:15 crc kubenswrapper[4816]: I0316 00:22:15.445754 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3e8bb6e1-f8fd-4484-ba21-a2d5f80f0d1c-util\") pod \"7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39exnc6z\" (UID: \"3e8bb6e1-f8fd-4484-ba21-a2d5f80f0d1c\") " pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39exnc6z" Mar 16 00:22:15 crc kubenswrapper[4816]: I0316 00:22:15.471688 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v2nxj\" (UniqueName: \"kubernetes.io/projected/3e8bb6e1-f8fd-4484-ba21-a2d5f80f0d1c-kube-api-access-v2nxj\") pod \"7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39exnc6z\" (UID: \"3e8bb6e1-f8fd-4484-ba21-a2d5f80f0d1c\") " pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39exnc6z" Mar 16 00:22:15 crc kubenswrapper[4816]: I0316 00:22:15.539592 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39exnc6z" Mar 16 00:22:15 crc kubenswrapper[4816]: I0316 00:22:15.756703 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39exnc6z"] Mar 16 00:22:15 crc kubenswrapper[4816]: W0316 00:22:15.760613 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3e8bb6e1_f8fd_4484_ba21_a2d5f80f0d1c.slice/crio-37c72f8cb01feb6ceb4fe0acb9fb6503fe2ed23ead252cb349195e1ac79b8d98 WatchSource:0}: Error finding container 37c72f8cb01feb6ceb4fe0acb9fb6503fe2ed23ead252cb349195e1ac79b8d98: Status 404 returned error can't find the container with id 37c72f8cb01feb6ceb4fe0acb9fb6503fe2ed23ead252cb349195e1ac79b8d98 Mar 16 00:22:16 crc kubenswrapper[4816]: I0316 00:22:16.219985 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-284nb" event={"ID":"449b2c21-4396-4d46-af73-e670b282f831","Type":"ContainerStarted","Data":"408f0d5869a150801603b166f7b7332bed6e990ef8c97b1a3a5ad99e36233861"} Mar 16 00:22:16 crc kubenswrapper[4816]: I0316 00:22:16.222798 4816 generic.go:334] "Generic (PLEG): container finished" podID="3e8bb6e1-f8fd-4484-ba21-a2d5f80f0d1c" containerID="5e2ebc35c72ed41a7d716bbf1b0af7bffa3e1ec2eac67adc6dcfa6135e61c059" exitCode=0 Mar 16 00:22:16 crc kubenswrapper[4816]: I0316 00:22:16.222844 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39exnc6z" event={"ID":"3e8bb6e1-f8fd-4484-ba21-a2d5f80f0d1c","Type":"ContainerDied","Data":"5e2ebc35c72ed41a7d716bbf1b0af7bffa3e1ec2eac67adc6dcfa6135e61c059"} Mar 16 00:22:16 crc kubenswrapper[4816]: I0316 00:22:16.222871 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39exnc6z" event={"ID":"3e8bb6e1-f8fd-4484-ba21-a2d5f80f0d1c","Type":"ContainerStarted","Data":"37c72f8cb01feb6ceb4fe0acb9fb6503fe2ed23ead252cb349195e1ac79b8d98"} Mar 16 00:22:16 crc kubenswrapper[4816]: I0316 00:22:16.233500 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fhdqd9"] Mar 16 00:22:16 crc kubenswrapper[4816]: I0316 00:22:16.238140 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fhdqd9" Mar 16 00:22:16 crc kubenswrapper[4816]: I0316 00:22:16.245187 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fhdqd9"] Mar 16 00:22:16 crc kubenswrapper[4816]: I0316 00:22:16.250196 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-284nb" podStartSLOduration=1.549956339 podStartE2EDuration="5.250167108s" podCreationTimestamp="2026-03-16 00:22:11 +0000 UTC" firstStartedPulling="2026-03-16 00:22:12.176427476 +0000 UTC m=+925.272727429" lastFinishedPulling="2026-03-16 00:22:15.876638245 +0000 UTC m=+928.972938198" observedRunningTime="2026-03-16 00:22:16.239884422 +0000 UTC m=+929.336184415" watchObservedRunningTime="2026-03-16 00:22:16.250167108 +0000 UTC m=+929.346467101" Mar 16 00:22:16 crc kubenswrapper[4816]: I0316 00:22:16.358992 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/43895212-4bba-4d69-b3eb-10f49e771de3-bundle\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fhdqd9\" (UID: \"43895212-4bba-4d69-b3eb-10f49e771de3\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fhdqd9" Mar 
16 00:22:16 crc kubenswrapper[4816]: I0316 00:22:16.359106 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4qp2v\" (UniqueName: \"kubernetes.io/projected/43895212-4bba-4d69-b3eb-10f49e771de3-kube-api-access-4qp2v\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fhdqd9\" (UID: \"43895212-4bba-4d69-b3eb-10f49e771de3\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fhdqd9" Mar 16 00:22:16 crc kubenswrapper[4816]: I0316 00:22:16.359177 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/43895212-4bba-4d69-b3eb-10f49e771de3-util\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fhdqd9\" (UID: \"43895212-4bba-4d69-b3eb-10f49e771de3\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fhdqd9" Mar 16 00:22:16 crc kubenswrapper[4816]: I0316 00:22:16.460404 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4qp2v\" (UniqueName: \"kubernetes.io/projected/43895212-4bba-4d69-b3eb-10f49e771de3-kube-api-access-4qp2v\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fhdqd9\" (UID: \"43895212-4bba-4d69-b3eb-10f49e771de3\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fhdqd9" Mar 16 00:22:16 crc kubenswrapper[4816]: I0316 00:22:16.460493 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/43895212-4bba-4d69-b3eb-10f49e771de3-util\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fhdqd9\" (UID: \"43895212-4bba-4d69-b3eb-10f49e771de3\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fhdqd9" Mar 16 00:22:16 crc kubenswrapper[4816]: I0316 00:22:16.460536 4816 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/43895212-4bba-4d69-b3eb-10f49e771de3-bundle\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fhdqd9\" (UID: \"43895212-4bba-4d69-b3eb-10f49e771de3\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fhdqd9" Mar 16 00:22:16 crc kubenswrapper[4816]: I0316 00:22:16.461109 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/43895212-4bba-4d69-b3eb-10f49e771de3-bundle\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fhdqd9\" (UID: \"43895212-4bba-4d69-b3eb-10f49e771de3\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fhdqd9" Mar 16 00:22:16 crc kubenswrapper[4816]: I0316 00:22:16.461373 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/43895212-4bba-4d69-b3eb-10f49e771de3-util\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fhdqd9\" (UID: \"43895212-4bba-4d69-b3eb-10f49e771de3\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fhdqd9" Mar 16 00:22:16 crc kubenswrapper[4816]: I0316 00:22:16.484041 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4qp2v\" (UniqueName: \"kubernetes.io/projected/43895212-4bba-4d69-b3eb-10f49e771de3-kube-api-access-4qp2v\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fhdqd9\" (UID: \"43895212-4bba-4d69-b3eb-10f49e771de3\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fhdqd9" Mar 16 00:22:16 crc kubenswrapper[4816]: I0316 00:22:16.555349 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fhdqd9" Mar 16 00:22:16 crc kubenswrapper[4816]: I0316 00:22:16.734932 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fhdqd9"] Mar 16 00:22:17 crc kubenswrapper[4816]: I0316 00:22:17.232894 4816 generic.go:334] "Generic (PLEG): container finished" podID="43895212-4bba-4d69-b3eb-10f49e771de3" containerID="25e48dfc32cfb5fae27e0f685f011357f858ff603b6c0a57b3df0dd8b97c8d60" exitCode=0 Mar 16 00:22:17 crc kubenswrapper[4816]: I0316 00:22:17.233007 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fhdqd9" event={"ID":"43895212-4bba-4d69-b3eb-10f49e771de3","Type":"ContainerDied","Data":"25e48dfc32cfb5fae27e0f685f011357f858ff603b6c0a57b3df0dd8b97c8d60"} Mar 16 00:22:17 crc kubenswrapper[4816]: I0316 00:22:17.233214 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fhdqd9" event={"ID":"43895212-4bba-4d69-b3eb-10f49e771de3","Type":"ContainerStarted","Data":"bc98dd0678141051e979632a0423832d0f036d7e8d23226dff5b4e233bb6610e"} Mar 16 00:22:17 crc kubenswrapper[4816]: I0316 00:22:17.237997 4816 generic.go:334] "Generic (PLEG): container finished" podID="3e8bb6e1-f8fd-4484-ba21-a2d5f80f0d1c" containerID="273c9ff624d959a6aab76f887c0b36554d89ab051a50554133a0618a2408c4ff" exitCode=0 Mar 16 00:22:17 crc kubenswrapper[4816]: I0316 00:22:17.238106 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39exnc6z" event={"ID":"3e8bb6e1-f8fd-4484-ba21-a2d5f80f0d1c","Type":"ContainerDied","Data":"273c9ff624d959a6aab76f887c0b36554d89ab051a50554133a0618a2408c4ff"} Mar 16 00:22:18 crc kubenswrapper[4816]: I0316 00:22:18.246799 4816 
generic.go:334] "Generic (PLEG): container finished" podID="3e8bb6e1-f8fd-4484-ba21-a2d5f80f0d1c" containerID="c7254f1bc611c01cc414758993eb7cab5b4e6b7c3657115ba8dad1b2b02641ea" exitCode=0 Mar 16 00:22:18 crc kubenswrapper[4816]: I0316 00:22:18.246854 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39exnc6z" event={"ID":"3e8bb6e1-f8fd-4484-ba21-a2d5f80f0d1c","Type":"ContainerDied","Data":"c7254f1bc611c01cc414758993eb7cab5b4e6b7c3657115ba8dad1b2b02641ea"} Mar 16 00:22:19 crc kubenswrapper[4816]: I0316 00:22:19.257637 4816 generic.go:334] "Generic (PLEG): container finished" podID="43895212-4bba-4d69-b3eb-10f49e771de3" containerID="846f4dc07d75a6d38979bd0bcefc6e0f7956ffb9542062ed83347ce157270bfd" exitCode=0 Mar 16 00:22:19 crc kubenswrapper[4816]: I0316 00:22:19.257740 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fhdqd9" event={"ID":"43895212-4bba-4d69-b3eb-10f49e771de3","Type":"ContainerDied","Data":"846f4dc07d75a6d38979bd0bcefc6e0f7956ffb9542062ed83347ce157270bfd"} Mar 16 00:22:19 crc kubenswrapper[4816]: I0316 00:22:19.653760 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39exnc6z" Mar 16 00:22:19 crc kubenswrapper[4816]: I0316 00:22:19.803566 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v2nxj\" (UniqueName: \"kubernetes.io/projected/3e8bb6e1-f8fd-4484-ba21-a2d5f80f0d1c-kube-api-access-v2nxj\") pod \"3e8bb6e1-f8fd-4484-ba21-a2d5f80f0d1c\" (UID: \"3e8bb6e1-f8fd-4484-ba21-a2d5f80f0d1c\") " Mar 16 00:22:19 crc kubenswrapper[4816]: I0316 00:22:19.803638 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3e8bb6e1-f8fd-4484-ba21-a2d5f80f0d1c-util\") pod \"3e8bb6e1-f8fd-4484-ba21-a2d5f80f0d1c\" (UID: \"3e8bb6e1-f8fd-4484-ba21-a2d5f80f0d1c\") " Mar 16 00:22:19 crc kubenswrapper[4816]: I0316 00:22:19.803671 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3e8bb6e1-f8fd-4484-ba21-a2d5f80f0d1c-bundle\") pod \"3e8bb6e1-f8fd-4484-ba21-a2d5f80f0d1c\" (UID: \"3e8bb6e1-f8fd-4484-ba21-a2d5f80f0d1c\") " Mar 16 00:22:19 crc kubenswrapper[4816]: I0316 00:22:19.805362 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3e8bb6e1-f8fd-4484-ba21-a2d5f80f0d1c-bundle" (OuterVolumeSpecName: "bundle") pod "3e8bb6e1-f8fd-4484-ba21-a2d5f80f0d1c" (UID: "3e8bb6e1-f8fd-4484-ba21-a2d5f80f0d1c"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:22:19 crc kubenswrapper[4816]: I0316 00:22:19.813700 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e8bb6e1-f8fd-4484-ba21-a2d5f80f0d1c-kube-api-access-v2nxj" (OuterVolumeSpecName: "kube-api-access-v2nxj") pod "3e8bb6e1-f8fd-4484-ba21-a2d5f80f0d1c" (UID: "3e8bb6e1-f8fd-4484-ba21-a2d5f80f0d1c"). InnerVolumeSpecName "kube-api-access-v2nxj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:22:19 crc kubenswrapper[4816]: I0316 00:22:19.826821 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3e8bb6e1-f8fd-4484-ba21-a2d5f80f0d1c-util" (OuterVolumeSpecName: "util") pod "3e8bb6e1-f8fd-4484-ba21-a2d5f80f0d1c" (UID: "3e8bb6e1-f8fd-4484-ba21-a2d5f80f0d1c"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:22:19 crc kubenswrapper[4816]: I0316 00:22:19.905498 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v2nxj\" (UniqueName: \"kubernetes.io/projected/3e8bb6e1-f8fd-4484-ba21-a2d5f80f0d1c-kube-api-access-v2nxj\") on node \"crc\" DevicePath \"\"" Mar 16 00:22:19 crc kubenswrapper[4816]: I0316 00:22:19.905533 4816 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3e8bb6e1-f8fd-4484-ba21-a2d5f80f0d1c-util\") on node \"crc\" DevicePath \"\"" Mar 16 00:22:19 crc kubenswrapper[4816]: I0316 00:22:19.905558 4816 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3e8bb6e1-f8fd-4484-ba21-a2d5f80f0d1c-bundle\") on node \"crc\" DevicePath \"\"" Mar 16 00:22:20 crc kubenswrapper[4816]: I0316 00:22:20.182014 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-jjrq7"] Mar 16 00:22:20 crc kubenswrapper[4816]: E0316 00:22:20.182256 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e8bb6e1-f8fd-4484-ba21-a2d5f80f0d1c" containerName="pull" Mar 16 00:22:20 crc kubenswrapper[4816]: I0316 00:22:20.182274 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e8bb6e1-f8fd-4484-ba21-a2d5f80f0d1c" containerName="pull" Mar 16 00:22:20 crc kubenswrapper[4816]: E0316 00:22:20.182291 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e8bb6e1-f8fd-4484-ba21-a2d5f80f0d1c" containerName="util" Mar 16 
00:22:20 crc kubenswrapper[4816]: I0316 00:22:20.182298 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e8bb6e1-f8fd-4484-ba21-a2d5f80f0d1c" containerName="util" Mar 16 00:22:20 crc kubenswrapper[4816]: E0316 00:22:20.182344 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e8bb6e1-f8fd-4484-ba21-a2d5f80f0d1c" containerName="extract" Mar 16 00:22:20 crc kubenswrapper[4816]: I0316 00:22:20.182353 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e8bb6e1-f8fd-4484-ba21-a2d5f80f0d1c" containerName="extract" Mar 16 00:22:20 crc kubenswrapper[4816]: I0316 00:22:20.182472 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e8bb6e1-f8fd-4484-ba21-a2d5f80f0d1c" containerName="extract" Mar 16 00:22:20 crc kubenswrapper[4816]: I0316 00:22:20.183421 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jjrq7" Mar 16 00:22:20 crc kubenswrapper[4816]: I0316 00:22:20.237000 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jjrq7"] Mar 16 00:22:20 crc kubenswrapper[4816]: I0316 00:22:20.265059 4816 generic.go:334] "Generic (PLEG): container finished" podID="43895212-4bba-4d69-b3eb-10f49e771de3" containerID="9753285d470f349be7b55d609e2dfe0e40687af01b3d1e9210904b74dc0363d8" exitCode=0 Mar 16 00:22:20 crc kubenswrapper[4816]: I0316 00:22:20.265120 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fhdqd9" event={"ID":"43895212-4bba-4d69-b3eb-10f49e771de3","Type":"ContainerDied","Data":"9753285d470f349be7b55d609e2dfe0e40687af01b3d1e9210904b74dc0363d8"} Mar 16 00:22:20 crc kubenswrapper[4816]: I0316 00:22:20.267015 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39exnc6z" 
event={"ID":"3e8bb6e1-f8fd-4484-ba21-a2d5f80f0d1c","Type":"ContainerDied","Data":"37c72f8cb01feb6ceb4fe0acb9fb6503fe2ed23ead252cb349195e1ac79b8d98"} Mar 16 00:22:20 crc kubenswrapper[4816]: I0316 00:22:20.267038 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="37c72f8cb01feb6ceb4fe0acb9fb6503fe2ed23ead252cb349195e1ac79b8d98" Mar 16 00:22:20 crc kubenswrapper[4816]: I0316 00:22:20.267070 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39exnc6z" Mar 16 00:22:20 crc kubenswrapper[4816]: I0316 00:22:20.309739 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d502b2b7-9ca7-4b92-bb7f-d1639a6a7ef2-utilities\") pod \"certified-operators-jjrq7\" (UID: \"d502b2b7-9ca7-4b92-bb7f-d1639a6a7ef2\") " pod="openshift-marketplace/certified-operators-jjrq7" Mar 16 00:22:20 crc kubenswrapper[4816]: I0316 00:22:20.309867 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62z8f\" (UniqueName: \"kubernetes.io/projected/d502b2b7-9ca7-4b92-bb7f-d1639a6a7ef2-kube-api-access-62z8f\") pod \"certified-operators-jjrq7\" (UID: \"d502b2b7-9ca7-4b92-bb7f-d1639a6a7ef2\") " pod="openshift-marketplace/certified-operators-jjrq7" Mar 16 00:22:20 crc kubenswrapper[4816]: I0316 00:22:20.309900 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d502b2b7-9ca7-4b92-bb7f-d1639a6a7ef2-catalog-content\") pod \"certified-operators-jjrq7\" (UID: \"d502b2b7-9ca7-4b92-bb7f-d1639a6a7ef2\") " pod="openshift-marketplace/certified-operators-jjrq7" Mar 16 00:22:20 crc kubenswrapper[4816]: I0316 00:22:20.410676 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-62z8f\" (UniqueName: \"kubernetes.io/projected/d502b2b7-9ca7-4b92-bb7f-d1639a6a7ef2-kube-api-access-62z8f\") pod \"certified-operators-jjrq7\" (UID: \"d502b2b7-9ca7-4b92-bb7f-d1639a6a7ef2\") " pod="openshift-marketplace/certified-operators-jjrq7" Mar 16 00:22:20 crc kubenswrapper[4816]: I0316 00:22:20.411279 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d502b2b7-9ca7-4b92-bb7f-d1639a6a7ef2-catalog-content\") pod \"certified-operators-jjrq7\" (UID: \"d502b2b7-9ca7-4b92-bb7f-d1639a6a7ef2\") " pod="openshift-marketplace/certified-operators-jjrq7" Mar 16 00:22:20 crc kubenswrapper[4816]: I0316 00:22:20.411434 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d502b2b7-9ca7-4b92-bb7f-d1639a6a7ef2-utilities\") pod \"certified-operators-jjrq7\" (UID: \"d502b2b7-9ca7-4b92-bb7f-d1639a6a7ef2\") " pod="openshift-marketplace/certified-operators-jjrq7" Mar 16 00:22:20 crc kubenswrapper[4816]: I0316 00:22:20.411765 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d502b2b7-9ca7-4b92-bb7f-d1639a6a7ef2-catalog-content\") pod \"certified-operators-jjrq7\" (UID: \"d502b2b7-9ca7-4b92-bb7f-d1639a6a7ef2\") " pod="openshift-marketplace/certified-operators-jjrq7" Mar 16 00:22:20 crc kubenswrapper[4816]: I0316 00:22:20.411867 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d502b2b7-9ca7-4b92-bb7f-d1639a6a7ef2-utilities\") pod \"certified-operators-jjrq7\" (UID: \"d502b2b7-9ca7-4b92-bb7f-d1639a6a7ef2\") " pod="openshift-marketplace/certified-operators-jjrq7" Mar 16 00:22:20 crc kubenswrapper[4816]: I0316 00:22:20.458252 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-62z8f\" (UniqueName: 
\"kubernetes.io/projected/d502b2b7-9ca7-4b92-bb7f-d1639a6a7ef2-kube-api-access-62z8f\") pod \"certified-operators-jjrq7\" (UID: \"d502b2b7-9ca7-4b92-bb7f-d1639a6a7ef2\") " pod="openshift-marketplace/certified-operators-jjrq7" Mar 16 00:22:20 crc kubenswrapper[4816]: I0316 00:22:20.498210 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jjrq7" Mar 16 00:22:21 crc kubenswrapper[4816]: I0316 00:22:21.023782 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jjrq7"] Mar 16 00:22:21 crc kubenswrapper[4816]: W0316 00:22:21.028468 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd502b2b7_9ca7_4b92_bb7f_d1639a6a7ef2.slice/crio-5750fa2beb8413e4bbf66bc2e4ec103cf698ec3743191fcfdc001a528f09e12f WatchSource:0}: Error finding container 5750fa2beb8413e4bbf66bc2e4ec103cf698ec3743191fcfdc001a528f09e12f: Status 404 returned error can't find the container with id 5750fa2beb8413e4bbf66bc2e4ec103cf698ec3743191fcfdc001a528f09e12f Mar 16 00:22:21 crc kubenswrapper[4816]: I0316 00:22:21.273926 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jjrq7" event={"ID":"d502b2b7-9ca7-4b92-bb7f-d1639a6a7ef2","Type":"ContainerStarted","Data":"e5839a641f5e4b47edb9ba28f2918214558d3b4454dab06397a37f22bd1120b7"} Mar 16 00:22:21 crc kubenswrapper[4816]: I0316 00:22:21.274269 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jjrq7" event={"ID":"d502b2b7-9ca7-4b92-bb7f-d1639a6a7ef2","Type":"ContainerStarted","Data":"5750fa2beb8413e4bbf66bc2e4ec103cf698ec3743191fcfdc001a528f09e12f"} Mar 16 00:22:21 crc kubenswrapper[4816]: I0316 00:22:21.679565 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fhdqd9" Mar 16 00:22:21 crc kubenswrapper[4816]: I0316 00:22:21.724306 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-284nb" Mar 16 00:22:21 crc kubenswrapper[4816]: I0316 00:22:21.727773 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-284nb" Mar 16 00:22:21 crc kubenswrapper[4816]: I0316 00:22:21.827718 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4qp2v\" (UniqueName: \"kubernetes.io/projected/43895212-4bba-4d69-b3eb-10f49e771de3-kube-api-access-4qp2v\") pod \"43895212-4bba-4d69-b3eb-10f49e771de3\" (UID: \"43895212-4bba-4d69-b3eb-10f49e771de3\") " Mar 16 00:22:21 crc kubenswrapper[4816]: I0316 00:22:21.827992 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/43895212-4bba-4d69-b3eb-10f49e771de3-bundle\") pod \"43895212-4bba-4d69-b3eb-10f49e771de3\" (UID: \"43895212-4bba-4d69-b3eb-10f49e771de3\") " Mar 16 00:22:21 crc kubenswrapper[4816]: I0316 00:22:21.828130 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/43895212-4bba-4d69-b3eb-10f49e771de3-util\") pod \"43895212-4bba-4d69-b3eb-10f49e771de3\" (UID: \"43895212-4bba-4d69-b3eb-10f49e771de3\") " Mar 16 00:22:21 crc kubenswrapper[4816]: I0316 00:22:21.828863 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/43895212-4bba-4d69-b3eb-10f49e771de3-bundle" (OuterVolumeSpecName: "bundle") pod "43895212-4bba-4d69-b3eb-10f49e771de3" (UID: "43895212-4bba-4d69-b3eb-10f49e771de3"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:22:21 crc kubenswrapper[4816]: I0316 00:22:21.835995 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43895212-4bba-4d69-b3eb-10f49e771de3-kube-api-access-4qp2v" (OuterVolumeSpecName: "kube-api-access-4qp2v") pod "43895212-4bba-4d69-b3eb-10f49e771de3" (UID: "43895212-4bba-4d69-b3eb-10f49e771de3"). InnerVolumeSpecName "kube-api-access-4qp2v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:22:21 crc kubenswrapper[4816]: I0316 00:22:21.866517 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/43895212-4bba-4d69-b3eb-10f49e771de3-util" (OuterVolumeSpecName: "util") pod "43895212-4bba-4d69-b3eb-10f49e771de3" (UID: "43895212-4bba-4d69-b3eb-10f49e771de3"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:22:21 crc kubenswrapper[4816]: I0316 00:22:21.929214 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4qp2v\" (UniqueName: \"kubernetes.io/projected/43895212-4bba-4d69-b3eb-10f49e771de3-kube-api-access-4qp2v\") on node \"crc\" DevicePath \"\"" Mar 16 00:22:21 crc kubenswrapper[4816]: I0316 00:22:21.929587 4816 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/43895212-4bba-4d69-b3eb-10f49e771de3-bundle\") on node \"crc\" DevicePath \"\"" Mar 16 00:22:21 crc kubenswrapper[4816]: I0316 00:22:21.929601 4816 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/43895212-4bba-4d69-b3eb-10f49e771de3-util\") on node \"crc\" DevicePath \"\"" Mar 16 00:22:22 crc kubenswrapper[4816]: I0316 00:22:22.282350 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fhdqd9" Mar 16 00:22:22 crc kubenswrapper[4816]: I0316 00:22:22.282339 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fhdqd9" event={"ID":"43895212-4bba-4d69-b3eb-10f49e771de3","Type":"ContainerDied","Data":"bc98dd0678141051e979632a0423832d0f036d7e8d23226dff5b4e233bb6610e"} Mar 16 00:22:22 crc kubenswrapper[4816]: I0316 00:22:22.282471 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bc98dd0678141051e979632a0423832d0f036d7e8d23226dff5b4e233bb6610e" Mar 16 00:22:22 crc kubenswrapper[4816]: I0316 00:22:22.284160 4816 generic.go:334] "Generic (PLEG): container finished" podID="d502b2b7-9ca7-4b92-bb7f-d1639a6a7ef2" containerID="e5839a641f5e4b47edb9ba28f2918214558d3b4454dab06397a37f22bd1120b7" exitCode=0 Mar 16 00:22:22 crc kubenswrapper[4816]: I0316 00:22:22.284243 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jjrq7" event={"ID":"d502b2b7-9ca7-4b92-bb7f-d1639a6a7ef2","Type":"ContainerDied","Data":"e5839a641f5e4b47edb9ba28f2918214558d3b4454dab06397a37f22bd1120b7"} Mar 16 00:22:22 crc kubenswrapper[4816]: I0316 00:22:22.780075 4816 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-284nb" podUID="449b2c21-4396-4d46-af73-e670b282f831" containerName="registry-server" probeResult="failure" output=< Mar 16 00:22:22 crc kubenswrapper[4816]: timeout: failed to connect service ":50051" within 1s Mar 16 00:22:22 crc kubenswrapper[4816]: > Mar 16 00:22:24 crc kubenswrapper[4816]: I0316 00:22:24.295361 4816 generic.go:334] "Generic (PLEG): container finished" podID="d502b2b7-9ca7-4b92-bb7f-d1639a6a7ef2" containerID="79acb07939ccc5b80b6a155b3eb5d1de07ee6d3e3a71e7de4d918cb8ef32d1ed" exitCode=0 Mar 16 00:22:24 crc kubenswrapper[4816]: I0316 
00:22:24.295469 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jjrq7" event={"ID":"d502b2b7-9ca7-4b92-bb7f-d1639a6a7ef2","Type":"ContainerDied","Data":"79acb07939ccc5b80b6a155b3eb5d1de07ee6d3e3a71e7de4d918cb8ef32d1ed"} Mar 16 00:22:24 crc kubenswrapper[4816]: I0316 00:22:24.639085 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59v87l"] Mar 16 00:22:24 crc kubenswrapper[4816]: E0316 00:22:24.639318 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43895212-4bba-4d69-b3eb-10f49e771de3" containerName="pull" Mar 16 00:22:24 crc kubenswrapper[4816]: I0316 00:22:24.639339 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="43895212-4bba-4d69-b3eb-10f49e771de3" containerName="pull" Mar 16 00:22:24 crc kubenswrapper[4816]: E0316 00:22:24.639363 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43895212-4bba-4d69-b3eb-10f49e771de3" containerName="util" Mar 16 00:22:24 crc kubenswrapper[4816]: I0316 00:22:24.639372 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="43895212-4bba-4d69-b3eb-10f49e771de3" containerName="util" Mar 16 00:22:24 crc kubenswrapper[4816]: E0316 00:22:24.639381 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43895212-4bba-4d69-b3eb-10f49e771de3" containerName="extract" Mar 16 00:22:24 crc kubenswrapper[4816]: I0316 00:22:24.639390 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="43895212-4bba-4d69-b3eb-10f49e771de3" containerName="extract" Mar 16 00:22:24 crc kubenswrapper[4816]: I0316 00:22:24.639505 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="43895212-4bba-4d69-b3eb-10f49e771de3" containerName="extract" Mar 16 00:22:24 crc kubenswrapper[4816]: I0316 00:22:24.640431 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59v87l" Mar 16 00:22:24 crc kubenswrapper[4816]: I0316 00:22:24.643191 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 16 00:22:24 crc kubenswrapper[4816]: I0316 00:22:24.650743 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59v87l"] Mar 16 00:22:24 crc kubenswrapper[4816]: I0316 00:22:24.762526 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1da45fda-a8cc-46c1-8831-58418ecc9819-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59v87l\" (UID: \"1da45fda-a8cc-46c1-8831-58418ecc9819\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59v87l" Mar 16 00:22:24 crc kubenswrapper[4816]: I0316 00:22:24.762602 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1da45fda-a8cc-46c1-8831-58418ecc9819-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59v87l\" (UID: \"1da45fda-a8cc-46c1-8831-58418ecc9819\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59v87l" Mar 16 00:22:24 crc kubenswrapper[4816]: I0316 00:22:24.762654 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4582c\" (UniqueName: \"kubernetes.io/projected/1da45fda-a8cc-46c1-8831-58418ecc9819-kube-api-access-4582c\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59v87l\" (UID: \"1da45fda-a8cc-46c1-8831-58418ecc9819\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59v87l" Mar 16 00:22:24 crc kubenswrapper[4816]: 
I0316 00:22:24.846221 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-tfv44"] Mar 16 00:22:24 crc kubenswrapper[4816]: I0316 00:22:24.846834 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-tfv44" Mar 16 00:22:24 crc kubenswrapper[4816]: I0316 00:22:24.849426 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Mar 16 00:22:24 crc kubenswrapper[4816]: I0316 00:22:24.849888 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-ft6r8" Mar 16 00:22:24 crc kubenswrapper[4816]: I0316 00:22:24.851591 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Mar 16 00:22:24 crc kubenswrapper[4816]: I0316 00:22:24.857374 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-tfv44"] Mar 16 00:22:24 crc kubenswrapper[4816]: I0316 00:22:24.863659 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4582c\" (UniqueName: \"kubernetes.io/projected/1da45fda-a8cc-46c1-8831-58418ecc9819-kube-api-access-4582c\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59v87l\" (UID: \"1da45fda-a8cc-46c1-8831-58418ecc9819\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59v87l" Mar 16 00:22:24 crc kubenswrapper[4816]: I0316 00:22:24.863719 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1da45fda-a8cc-46c1-8831-58418ecc9819-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59v87l\" (UID: \"1da45fda-a8cc-46c1-8831-58418ecc9819\") " 
pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59v87l" Mar 16 00:22:24 crc kubenswrapper[4816]: I0316 00:22:24.863765 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1da45fda-a8cc-46c1-8831-58418ecc9819-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59v87l\" (UID: \"1da45fda-a8cc-46c1-8831-58418ecc9819\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59v87l" Mar 16 00:22:24 crc kubenswrapper[4816]: I0316 00:22:24.864273 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1da45fda-a8cc-46c1-8831-58418ecc9819-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59v87l\" (UID: \"1da45fda-a8cc-46c1-8831-58418ecc9819\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59v87l" Mar 16 00:22:24 crc kubenswrapper[4816]: I0316 00:22:24.864392 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1da45fda-a8cc-46c1-8831-58418ecc9819-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59v87l\" (UID: \"1da45fda-a8cc-46c1-8831-58418ecc9819\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59v87l" Mar 16 00:22:24 crc kubenswrapper[4816]: I0316 00:22:24.896429 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4582c\" (UniqueName: \"kubernetes.io/projected/1da45fda-a8cc-46c1-8831-58418ecc9819-kube-api-access-4582c\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59v87l\" (UID: \"1da45fda-a8cc-46c1-8831-58418ecc9819\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59v87l" Mar 16 00:22:24 crc kubenswrapper[4816]: I0316 00:22:24.952833 4816 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59v87l" Mar 16 00:22:24 crc kubenswrapper[4816]: I0316 00:22:24.965431 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x5nfp\" (UniqueName: \"kubernetes.io/projected/562f24fe-5c4c-4540-96ae-6e01f539141b-kube-api-access-x5nfp\") pod \"obo-prometheus-operator-68bc856cb9-tfv44\" (UID: \"562f24fe-5c4c-4540-96ae-6e01f539141b\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-tfv44" Mar 16 00:22:24 crc kubenswrapper[4816]: I0316 00:22:24.986837 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6d8d794bc-2s9sk"] Mar 16 00:22:24 crc kubenswrapper[4816]: I0316 00:22:24.987485 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6d8d794bc-2s9sk" Mar 16 00:22:24 crc kubenswrapper[4816]: I0316 00:22:24.990819 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-wqns4" Mar 16 00:22:24 crc kubenswrapper[4816]: I0316 00:22:24.991090 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Mar 16 00:22:24 crc kubenswrapper[4816]: I0316 00:22:24.996958 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6d8d794bc-6bdkn"] Mar 16 00:22:24 crc kubenswrapper[4816]: I0316 00:22:24.997654 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6d8d794bc-6bdkn" Mar 16 00:22:25 crc kubenswrapper[4816]: I0316 00:22:25.004294 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6d8d794bc-2s9sk"] Mar 16 00:22:25 crc kubenswrapper[4816]: I0316 00:22:25.023651 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6d8d794bc-6bdkn"] Mar 16 00:22:25 crc kubenswrapper[4816]: I0316 00:22:25.066430 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x5nfp\" (UniqueName: \"kubernetes.io/projected/562f24fe-5c4c-4540-96ae-6e01f539141b-kube-api-access-x5nfp\") pod \"obo-prometheus-operator-68bc856cb9-tfv44\" (UID: \"562f24fe-5c4c-4540-96ae-6e01f539141b\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-tfv44" Mar 16 00:22:25 crc kubenswrapper[4816]: I0316 00:22:25.089625 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x5nfp\" (UniqueName: \"kubernetes.io/projected/562f24fe-5c4c-4540-96ae-6e01f539141b-kube-api-access-x5nfp\") pod \"obo-prometheus-operator-68bc856cb9-tfv44\" (UID: \"562f24fe-5c4c-4540-96ae-6e01f539141b\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-tfv44" Mar 16 00:22:25 crc kubenswrapper[4816]: I0316 00:22:25.162322 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-tfv44" Mar 16 00:22:25 crc kubenswrapper[4816]: I0316 00:22:25.167517 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9a808114-3164-4abe-a481-1b5d3b9df2a0-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6d8d794bc-2s9sk\" (UID: \"9a808114-3164-4abe-a481-1b5d3b9df2a0\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6d8d794bc-2s9sk" Mar 16 00:22:25 crc kubenswrapper[4816]: I0316 00:22:25.167586 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9a808114-3164-4abe-a481-1b5d3b9df2a0-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6d8d794bc-2s9sk\" (UID: \"9a808114-3164-4abe-a481-1b5d3b9df2a0\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6d8d794bc-2s9sk" Mar 16 00:22:25 crc kubenswrapper[4816]: I0316 00:22:25.167614 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/36951342-3370-4291-baa3-2612f64036fd-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6d8d794bc-6bdkn\" (UID: \"36951342-3370-4291-baa3-2612f64036fd\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6d8d794bc-6bdkn" Mar 16 00:22:25 crc kubenswrapper[4816]: I0316 00:22:25.167651 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/36951342-3370-4291-baa3-2612f64036fd-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6d8d794bc-6bdkn\" (UID: \"36951342-3370-4291-baa3-2612f64036fd\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6d8d794bc-6bdkn" Mar 16 00:22:25 crc 
kubenswrapper[4816]: I0316 00:22:25.197154 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-w6wv7"] Mar 16 00:22:25 crc kubenswrapper[4816]: I0316 00:22:25.197990 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-w6wv7" Mar 16 00:22:25 crc kubenswrapper[4816]: I0316 00:22:25.208753 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-m8rpm" Mar 16 00:22:25 crc kubenswrapper[4816]: I0316 00:22:25.208984 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Mar 16 00:22:25 crc kubenswrapper[4816]: I0316 00:22:25.223009 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-w6wv7"] Mar 16 00:22:25 crc kubenswrapper[4816]: I0316 00:22:25.271595 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9a808114-3164-4abe-a481-1b5d3b9df2a0-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6d8d794bc-2s9sk\" (UID: \"9a808114-3164-4abe-a481-1b5d3b9df2a0\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6d8d794bc-2s9sk" Mar 16 00:22:25 crc kubenswrapper[4816]: I0316 00:22:25.271681 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9a808114-3164-4abe-a481-1b5d3b9df2a0-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6d8d794bc-2s9sk\" (UID: \"9a808114-3164-4abe-a481-1b5d3b9df2a0\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6d8d794bc-2s9sk" Mar 16 00:22:25 crc kubenswrapper[4816]: I0316 00:22:25.271721 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" 
(UniqueName: \"kubernetes.io/secret/36951342-3370-4291-baa3-2612f64036fd-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6d8d794bc-6bdkn\" (UID: \"36951342-3370-4291-baa3-2612f64036fd\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6d8d794bc-6bdkn" Mar 16 00:22:25 crc kubenswrapper[4816]: I0316 00:22:25.271768 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/36951342-3370-4291-baa3-2612f64036fd-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6d8d794bc-6bdkn\" (UID: \"36951342-3370-4291-baa3-2612f64036fd\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6d8d794bc-6bdkn" Mar 16 00:22:25 crc kubenswrapper[4816]: I0316 00:22:25.280890 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9a808114-3164-4abe-a481-1b5d3b9df2a0-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6d8d794bc-2s9sk\" (UID: \"9a808114-3164-4abe-a481-1b5d3b9df2a0\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6d8d794bc-2s9sk" Mar 16 00:22:25 crc kubenswrapper[4816]: I0316 00:22:25.280912 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/36951342-3370-4291-baa3-2612f64036fd-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6d8d794bc-6bdkn\" (UID: \"36951342-3370-4291-baa3-2612f64036fd\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6d8d794bc-6bdkn" Mar 16 00:22:25 crc kubenswrapper[4816]: I0316 00:22:25.280913 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9a808114-3164-4abe-a481-1b5d3b9df2a0-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6d8d794bc-2s9sk\" (UID: \"9a808114-3164-4abe-a481-1b5d3b9df2a0\") " 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-6d8d794bc-2s9sk" Mar 16 00:22:25 crc kubenswrapper[4816]: I0316 00:22:25.283694 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/36951342-3370-4291-baa3-2612f64036fd-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6d8d794bc-6bdkn\" (UID: \"36951342-3370-4291-baa3-2612f64036fd\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6d8d794bc-6bdkn" Mar 16 00:22:25 crc kubenswrapper[4816]: I0316 00:22:25.296285 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59v87l"] Mar 16 00:22:25 crc kubenswrapper[4816]: I0316 00:22:25.321768 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jjrq7" event={"ID":"d502b2b7-9ca7-4b92-bb7f-d1639a6a7ef2","Type":"ContainerStarted","Data":"f43a292651495e3c7ec54a104d349e2f6851097dd5f8683239c8c438dec4317a"} Mar 16 00:22:25 crc kubenswrapper[4816]: I0316 00:22:25.347496 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6d8d794bc-2s9sk" Mar 16 00:22:25 crc kubenswrapper[4816]: I0316 00:22:25.347577 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-jjrq7" podStartSLOduration=2.930626549 podStartE2EDuration="5.347541967s" podCreationTimestamp="2026-03-16 00:22:20 +0000 UTC" firstStartedPulling="2026-03-16 00:22:22.286508067 +0000 UTC m=+935.382808020" lastFinishedPulling="2026-03-16 00:22:24.703423485 +0000 UTC m=+937.799723438" observedRunningTime="2026-03-16 00:22:25.344016846 +0000 UTC m=+938.440316799" watchObservedRunningTime="2026-03-16 00:22:25.347541967 +0000 UTC m=+938.443841920" Mar 16 00:22:25 crc kubenswrapper[4816]: I0316 00:22:25.361851 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6d8d794bc-6bdkn" Mar 16 00:22:25 crc kubenswrapper[4816]: I0316 00:22:25.379241 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mz5bq\" (UniqueName: \"kubernetes.io/projected/8d0f60fa-8d26-43ea-a680-1d3a92dd270d-kube-api-access-mz5bq\") pod \"observability-operator-59bdc8b94-w6wv7\" (UID: \"8d0f60fa-8d26-43ea-a680-1d3a92dd270d\") " pod="openshift-operators/observability-operator-59bdc8b94-w6wv7" Mar 16 00:22:25 crc kubenswrapper[4816]: I0316 00:22:25.379326 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/8d0f60fa-8d26-43ea-a680-1d3a92dd270d-observability-operator-tls\") pod \"observability-operator-59bdc8b94-w6wv7\" (UID: \"8d0f60fa-8d26-43ea-a680-1d3a92dd270d\") " pod="openshift-operators/observability-operator-59bdc8b94-w6wv7" Mar 16 00:22:25 crc kubenswrapper[4816]: I0316 00:22:25.396830 4816 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-operators/perses-operator-5bf474d74f-t7w7m"] Mar 16 00:22:25 crc kubenswrapper[4816]: I0316 00:22:25.397717 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-t7w7m" Mar 16 00:22:25 crc kubenswrapper[4816]: I0316 00:22:25.406537 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-mjv48" Mar 16 00:22:25 crc kubenswrapper[4816]: I0316 00:22:25.433259 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-t7w7m"] Mar 16 00:22:25 crc kubenswrapper[4816]: I0316 00:22:25.486180 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mz5bq\" (UniqueName: \"kubernetes.io/projected/8d0f60fa-8d26-43ea-a680-1d3a92dd270d-kube-api-access-mz5bq\") pod \"observability-operator-59bdc8b94-w6wv7\" (UID: \"8d0f60fa-8d26-43ea-a680-1d3a92dd270d\") " pod="openshift-operators/observability-operator-59bdc8b94-w6wv7" Mar 16 00:22:25 crc kubenswrapper[4816]: I0316 00:22:25.486282 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/8d0f60fa-8d26-43ea-a680-1d3a92dd270d-observability-operator-tls\") pod \"observability-operator-59bdc8b94-w6wv7\" (UID: \"8d0f60fa-8d26-43ea-a680-1d3a92dd270d\") " pod="openshift-operators/observability-operator-59bdc8b94-w6wv7" Mar 16 00:22:25 crc kubenswrapper[4816]: I0316 00:22:25.491194 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/8d0f60fa-8d26-43ea-a680-1d3a92dd270d-observability-operator-tls\") pod \"observability-operator-59bdc8b94-w6wv7\" (UID: \"8d0f60fa-8d26-43ea-a680-1d3a92dd270d\") " pod="openshift-operators/observability-operator-59bdc8b94-w6wv7" Mar 16 00:22:25 crc kubenswrapper[4816]: I0316 00:22:25.509317 4816 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mz5bq\" (UniqueName: \"kubernetes.io/projected/8d0f60fa-8d26-43ea-a680-1d3a92dd270d-kube-api-access-mz5bq\") pod \"observability-operator-59bdc8b94-w6wv7\" (UID: \"8d0f60fa-8d26-43ea-a680-1d3a92dd270d\") " pod="openshift-operators/observability-operator-59bdc8b94-w6wv7" Mar 16 00:22:25 crc kubenswrapper[4816]: I0316 00:22:25.545862 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-w6wv7" Mar 16 00:22:25 crc kubenswrapper[4816]: I0316 00:22:25.569044 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-tfv44"] Mar 16 00:22:25 crc kubenswrapper[4816]: I0316 00:22:25.589096 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/f24959c1-f57f-4bf6-8a55-c8a35173ff8b-openshift-service-ca\") pod \"perses-operator-5bf474d74f-t7w7m\" (UID: \"f24959c1-f57f-4bf6-8a55-c8a35173ff8b\") " pod="openshift-operators/perses-operator-5bf474d74f-t7w7m" Mar 16 00:22:25 crc kubenswrapper[4816]: I0316 00:22:25.589179 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9g45v\" (UniqueName: \"kubernetes.io/projected/f24959c1-f57f-4bf6-8a55-c8a35173ff8b-kube-api-access-9g45v\") pod \"perses-operator-5bf474d74f-t7w7m\" (UID: \"f24959c1-f57f-4bf6-8a55-c8a35173ff8b\") " pod="openshift-operators/perses-operator-5bf474d74f-t7w7m" Mar 16 00:22:25 crc kubenswrapper[4816]: I0316 00:22:25.689072 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6d8d794bc-2s9sk"] Mar 16 00:22:25 crc kubenswrapper[4816]: I0316 00:22:25.690205 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9g45v\" 
(UniqueName: \"kubernetes.io/projected/f24959c1-f57f-4bf6-8a55-c8a35173ff8b-kube-api-access-9g45v\") pod \"perses-operator-5bf474d74f-t7w7m\" (UID: \"f24959c1-f57f-4bf6-8a55-c8a35173ff8b\") " pod="openshift-operators/perses-operator-5bf474d74f-t7w7m" Mar 16 00:22:25 crc kubenswrapper[4816]: I0316 00:22:25.690288 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/f24959c1-f57f-4bf6-8a55-c8a35173ff8b-openshift-service-ca\") pod \"perses-operator-5bf474d74f-t7w7m\" (UID: \"f24959c1-f57f-4bf6-8a55-c8a35173ff8b\") " pod="openshift-operators/perses-operator-5bf474d74f-t7w7m" Mar 16 00:22:25 crc kubenswrapper[4816]: I0316 00:22:25.691499 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/f24959c1-f57f-4bf6-8a55-c8a35173ff8b-openshift-service-ca\") pod \"perses-operator-5bf474d74f-t7w7m\" (UID: \"f24959c1-f57f-4bf6-8a55-c8a35173ff8b\") " pod="openshift-operators/perses-operator-5bf474d74f-t7w7m" Mar 16 00:22:25 crc kubenswrapper[4816]: W0316 00:22:25.714758 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9a808114_3164_4abe_a481_1b5d3b9df2a0.slice/crio-b9994a1aa8a861ee26befc56024cc3fe17c6f7e7ffbbe13fd66ae13f48d231ff WatchSource:0}: Error finding container b9994a1aa8a861ee26befc56024cc3fe17c6f7e7ffbbe13fd66ae13f48d231ff: Status 404 returned error can't find the container with id b9994a1aa8a861ee26befc56024cc3fe17c6f7e7ffbbe13fd66ae13f48d231ff Mar 16 00:22:25 crc kubenswrapper[4816]: I0316 00:22:25.715465 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9g45v\" (UniqueName: \"kubernetes.io/projected/f24959c1-f57f-4bf6-8a55-c8a35173ff8b-kube-api-access-9g45v\") pod \"perses-operator-5bf474d74f-t7w7m\" (UID: \"f24959c1-f57f-4bf6-8a55-c8a35173ff8b\") " 
pod="openshift-operators/perses-operator-5bf474d74f-t7w7m" Mar 16 00:22:25 crc kubenswrapper[4816]: I0316 00:22:25.746416 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-t7w7m" Mar 16 00:22:25 crc kubenswrapper[4816]: I0316 00:22:25.899391 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-w6wv7"] Mar 16 00:22:25 crc kubenswrapper[4816]: I0316 00:22:25.979309 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6d8d794bc-6bdkn"] Mar 16 00:22:26 crc kubenswrapper[4816]: I0316 00:22:26.150469 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-t7w7m"] Mar 16 00:22:26 crc kubenswrapper[4816]: W0316 00:22:26.154344 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf24959c1_f57f_4bf6_8a55_c8a35173ff8b.slice/crio-81b77c3544cb92b1b3ff3f953b9590e553e22ceb2885c94f32353351ab9051c7 WatchSource:0}: Error finding container 81b77c3544cb92b1b3ff3f953b9590e553e22ceb2885c94f32353351ab9051c7: Status 404 returned error can't find the container with id 81b77c3544cb92b1b3ff3f953b9590e553e22ceb2885c94f32353351ab9051c7 Mar 16 00:22:26 crc kubenswrapper[4816]: I0316 00:22:26.329430 4816 generic.go:334] "Generic (PLEG): container finished" podID="1da45fda-a8cc-46c1-8831-58418ecc9819" containerID="c8f418e6cc25e4a0e78eae89961561eed7258b5829ebb9c402f1e3fe0c654d54" exitCode=0 Mar 16 00:22:26 crc kubenswrapper[4816]: I0316 00:22:26.329537 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59v87l" event={"ID":"1da45fda-a8cc-46c1-8831-58418ecc9819","Type":"ContainerDied","Data":"c8f418e6cc25e4a0e78eae89961561eed7258b5829ebb9c402f1e3fe0c654d54"} Mar 16 00:22:26 crc 
kubenswrapper[4816]: I0316 00:22:26.329584 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59v87l" event={"ID":"1da45fda-a8cc-46c1-8831-58418ecc9819","Type":"ContainerStarted","Data":"423ca09a07f2da9396c84b4219be8387d28d6dd64d1f4c92b01055a8dae546ea"} Mar 16 00:22:26 crc kubenswrapper[4816]: I0316 00:22:26.331137 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-t7w7m" event={"ID":"f24959c1-f57f-4bf6-8a55-c8a35173ff8b","Type":"ContainerStarted","Data":"81b77c3544cb92b1b3ff3f953b9590e553e22ceb2885c94f32353351ab9051c7"} Mar 16 00:22:26 crc kubenswrapper[4816]: I0316 00:22:26.332502 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6d8d794bc-6bdkn" event={"ID":"36951342-3370-4291-baa3-2612f64036fd","Type":"ContainerStarted","Data":"b1b87b98fc34f49cfda8b008ea32b014b1f814bc1cc1fbd9cf8085c1089ce8ad"} Mar 16 00:22:26 crc kubenswrapper[4816]: I0316 00:22:26.334343 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-w6wv7" event={"ID":"8d0f60fa-8d26-43ea-a680-1d3a92dd270d","Type":"ContainerStarted","Data":"135ed522473def6429d8458aff130077f8733eb170e1395ff9e94fcf67d9cb23"} Mar 16 00:22:26 crc kubenswrapper[4816]: I0316 00:22:26.335461 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-tfv44" event={"ID":"562f24fe-5c4c-4540-96ae-6e01f539141b","Type":"ContainerStarted","Data":"4e5b9318c1f0f7a04ad87c88c6d75e4c42de00878e25b32bc150300ec987975e"} Mar 16 00:22:26 crc kubenswrapper[4816]: I0316 00:22:26.336910 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6d8d794bc-2s9sk" 
event={"ID":"9a808114-3164-4abe-a481-1b5d3b9df2a0","Type":"ContainerStarted","Data":"b9994a1aa8a861ee26befc56024cc3fe17c6f7e7ffbbe13fd66ae13f48d231ff"} Mar 16 00:22:30 crc kubenswrapper[4816]: I0316 00:22:30.499110 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-jjrq7" Mar 16 00:22:30 crc kubenswrapper[4816]: I0316 00:22:30.499636 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-jjrq7" Mar 16 00:22:30 crc kubenswrapper[4816]: I0316 00:22:30.611278 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-jjrq7" Mar 16 00:22:30 crc kubenswrapper[4816]: I0316 00:22:30.767050 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/elastic-operator-8494b79c9c-fmbm9"] Mar 16 00:22:30 crc kubenswrapper[4816]: I0316 00:22:30.767774 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/elastic-operator-8494b79c9c-fmbm9" Mar 16 00:22:30 crc kubenswrapper[4816]: I0316 00:22:30.769831 4816 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elastic-operator-service-cert" Mar 16 00:22:30 crc kubenswrapper[4816]: I0316 00:22:30.770072 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"openshift-service-ca.crt" Mar 16 00:22:30 crc kubenswrapper[4816]: I0316 00:22:30.770623 4816 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elastic-operator-dockercfg-xhl6t" Mar 16 00:22:30 crc kubenswrapper[4816]: I0316 00:22:30.770940 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"kube-root-ca.crt" Mar 16 00:22:30 crc kubenswrapper[4816]: I0316 00:22:30.797317 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elastic-operator-8494b79c9c-fmbm9"] Mar 16 00:22:30 crc kubenswrapper[4816]: I0316 00:22:30.878168 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0c23dce2-a24c-4f57-9311-56675376c95e-apiservice-cert\") pod \"elastic-operator-8494b79c9c-fmbm9\" (UID: \"0c23dce2-a24c-4f57-9311-56675376c95e\") " pod="service-telemetry/elastic-operator-8494b79c9c-fmbm9" Mar 16 00:22:30 crc kubenswrapper[4816]: I0316 00:22:30.878205 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0c23dce2-a24c-4f57-9311-56675376c95e-webhook-cert\") pod \"elastic-operator-8494b79c9c-fmbm9\" (UID: \"0c23dce2-a24c-4f57-9311-56675376c95e\") " pod="service-telemetry/elastic-operator-8494b79c9c-fmbm9" Mar 16 00:22:30 crc kubenswrapper[4816]: I0316 00:22:30.878239 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-ch7rt\" (UniqueName: \"kubernetes.io/projected/0c23dce2-a24c-4f57-9311-56675376c95e-kube-api-access-ch7rt\") pod \"elastic-operator-8494b79c9c-fmbm9\" (UID: \"0c23dce2-a24c-4f57-9311-56675376c95e\") " pod="service-telemetry/elastic-operator-8494b79c9c-fmbm9" Mar 16 00:22:30 crc kubenswrapper[4816]: I0316 00:22:30.980441 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0c23dce2-a24c-4f57-9311-56675376c95e-apiservice-cert\") pod \"elastic-operator-8494b79c9c-fmbm9\" (UID: \"0c23dce2-a24c-4f57-9311-56675376c95e\") " pod="service-telemetry/elastic-operator-8494b79c9c-fmbm9" Mar 16 00:22:30 crc kubenswrapper[4816]: I0316 00:22:30.980487 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0c23dce2-a24c-4f57-9311-56675376c95e-webhook-cert\") pod \"elastic-operator-8494b79c9c-fmbm9\" (UID: \"0c23dce2-a24c-4f57-9311-56675376c95e\") " pod="service-telemetry/elastic-operator-8494b79c9c-fmbm9" Mar 16 00:22:30 crc kubenswrapper[4816]: I0316 00:22:30.980517 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ch7rt\" (UniqueName: \"kubernetes.io/projected/0c23dce2-a24c-4f57-9311-56675376c95e-kube-api-access-ch7rt\") pod \"elastic-operator-8494b79c9c-fmbm9\" (UID: \"0c23dce2-a24c-4f57-9311-56675376c95e\") " pod="service-telemetry/elastic-operator-8494b79c9c-fmbm9" Mar 16 00:22:30 crc kubenswrapper[4816]: I0316 00:22:30.992423 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0c23dce2-a24c-4f57-9311-56675376c95e-apiservice-cert\") pod \"elastic-operator-8494b79c9c-fmbm9\" (UID: \"0c23dce2-a24c-4f57-9311-56675376c95e\") " pod="service-telemetry/elastic-operator-8494b79c9c-fmbm9" Mar 16 00:22:30 crc kubenswrapper[4816]: I0316 00:22:30.994928 4816 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0c23dce2-a24c-4f57-9311-56675376c95e-webhook-cert\") pod \"elastic-operator-8494b79c9c-fmbm9\" (UID: \"0c23dce2-a24c-4f57-9311-56675376c95e\") " pod="service-telemetry/elastic-operator-8494b79c9c-fmbm9" Mar 16 00:22:31 crc kubenswrapper[4816]: I0316 00:22:31.014443 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ch7rt\" (UniqueName: \"kubernetes.io/projected/0c23dce2-a24c-4f57-9311-56675376c95e-kube-api-access-ch7rt\") pod \"elastic-operator-8494b79c9c-fmbm9\" (UID: \"0c23dce2-a24c-4f57-9311-56675376c95e\") " pod="service-telemetry/elastic-operator-8494b79c9c-fmbm9" Mar 16 00:22:31 crc kubenswrapper[4816]: I0316 00:22:31.088178 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/elastic-operator-8494b79c9c-fmbm9" Mar 16 00:22:31 crc kubenswrapper[4816]: I0316 00:22:31.449209 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-jjrq7" Mar 16 00:22:31 crc kubenswrapper[4816]: I0316 00:22:31.761799 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-284nb" Mar 16 00:22:31 crc kubenswrapper[4816]: I0316 00:22:31.824077 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-284nb" Mar 16 00:22:31 crc kubenswrapper[4816]: I0316 00:22:31.863176 4816 patch_prober.go:28] interesting pod/machine-config-daemon-jrdcz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 16 00:22:31 crc kubenswrapper[4816]: I0316 00:22:31.863238 4816 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" podUID="dd08ece2-7636-4966-973a-e96a34b70b53" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 16 00:22:31 crc kubenswrapper[4816]: I0316 00:22:31.863299 4816 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" Mar 16 00:22:31 crc kubenswrapper[4816]: I0316 00:22:31.864201 4816 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d940a23c182654ea98c304045d406af01d62b828901045324158f53e5e4988ad"} pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 16 00:22:31 crc kubenswrapper[4816]: I0316 00:22:31.864267 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" podUID="dd08ece2-7636-4966-973a-e96a34b70b53" containerName="machine-config-daemon" containerID="cri-o://d940a23c182654ea98c304045d406af01d62b828901045324158f53e5e4988ad" gracePeriod=600 Mar 16 00:22:32 crc kubenswrapper[4816]: I0316 00:22:32.396481 4816 generic.go:334] "Generic (PLEG): container finished" podID="dd08ece2-7636-4966-973a-e96a34b70b53" containerID="d940a23c182654ea98c304045d406af01d62b828901045324158f53e5e4988ad" exitCode=0 Mar 16 00:22:32 crc kubenswrapper[4816]: I0316 00:22:32.396911 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" event={"ID":"dd08ece2-7636-4966-973a-e96a34b70b53","Type":"ContainerDied","Data":"d940a23c182654ea98c304045d406af01d62b828901045324158f53e5e4988ad"} Mar 16 00:22:32 crc kubenswrapper[4816]: I0316 00:22:32.397003 4816 scope.go:117] "RemoveContainer" 
containerID="054dcd9294a0533063364a3ea7e009e513fea0236f1afad37201a02a85a0eee4" Mar 16 00:22:33 crc kubenswrapper[4816]: I0316 00:22:33.153265 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/interconnect-operator-5bb49f789d-kd6nx"] Mar 16 00:22:33 crc kubenswrapper[4816]: I0316 00:22:33.154125 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/interconnect-operator-5bb49f789d-kd6nx" Mar 16 00:22:33 crc kubenswrapper[4816]: I0316 00:22:33.173122 4816 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"interconnect-operator-dockercfg-xgl2z" Mar 16 00:22:33 crc kubenswrapper[4816]: I0316 00:22:33.211659 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/interconnect-operator-5bb49f789d-kd6nx"] Mar 16 00:22:33 crc kubenswrapper[4816]: I0316 00:22:33.313992 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kpjjm\" (UniqueName: \"kubernetes.io/projected/f26ba6ee-c940-434d-80fe-81c813576ac9-kube-api-access-kpjjm\") pod \"interconnect-operator-5bb49f789d-kd6nx\" (UID: \"f26ba6ee-c940-434d-80fe-81c813576ac9\") " pod="service-telemetry/interconnect-operator-5bb49f789d-kd6nx" Mar 16 00:22:33 crc kubenswrapper[4816]: I0316 00:22:33.415626 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kpjjm\" (UniqueName: \"kubernetes.io/projected/f26ba6ee-c940-434d-80fe-81c813576ac9-kube-api-access-kpjjm\") pod \"interconnect-operator-5bb49f789d-kd6nx\" (UID: \"f26ba6ee-c940-434d-80fe-81c813576ac9\") " pod="service-telemetry/interconnect-operator-5bb49f789d-kd6nx" Mar 16 00:22:33 crc kubenswrapper[4816]: I0316 00:22:33.442774 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kpjjm\" (UniqueName: \"kubernetes.io/projected/f26ba6ee-c940-434d-80fe-81c813576ac9-kube-api-access-kpjjm\") pod 
\"interconnect-operator-5bb49f789d-kd6nx\" (UID: \"f26ba6ee-c940-434d-80fe-81c813576ac9\") " pod="service-telemetry/interconnect-operator-5bb49f789d-kd6nx" Mar 16 00:22:33 crc kubenswrapper[4816]: I0316 00:22:33.468906 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/interconnect-operator-5bb49f789d-kd6nx" Mar 16 00:22:35 crc kubenswrapper[4816]: I0316 00:22:35.179206 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jjrq7"] Mar 16 00:22:35 crc kubenswrapper[4816]: I0316 00:22:35.179421 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-jjrq7" podUID="d502b2b7-9ca7-4b92-bb7f-d1639a6a7ef2" containerName="registry-server" containerID="cri-o://f43a292651495e3c7ec54a104d349e2f6851097dd5f8683239c8c438dec4317a" gracePeriod=2 Mar 16 00:22:36 crc kubenswrapper[4816]: I0316 00:22:36.169448 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-284nb"] Mar 16 00:22:36 crc kubenswrapper[4816]: I0316 00:22:36.169684 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-284nb" podUID="449b2c21-4396-4d46-af73-e670b282f831" containerName="registry-server" containerID="cri-o://408f0d5869a150801603b166f7b7332bed6e990ef8c97b1a3a5ad99e36233861" gracePeriod=2 Mar 16 00:22:36 crc kubenswrapper[4816]: I0316 00:22:36.436280 4816 generic.go:334] "Generic (PLEG): container finished" podID="d502b2b7-9ca7-4b92-bb7f-d1639a6a7ef2" containerID="f43a292651495e3c7ec54a104d349e2f6851097dd5f8683239c8c438dec4317a" exitCode=0 Mar 16 00:22:36 crc kubenswrapper[4816]: I0316 00:22:36.436327 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jjrq7" 
event={"ID":"d502b2b7-9ca7-4b92-bb7f-d1639a6a7ef2","Type":"ContainerDied","Data":"f43a292651495e3c7ec54a104d349e2f6851097dd5f8683239c8c438dec4317a"} Mar 16 00:22:37 crc kubenswrapper[4816]: I0316 00:22:37.446675 4816 generic.go:334] "Generic (PLEG): container finished" podID="449b2c21-4396-4d46-af73-e670b282f831" containerID="408f0d5869a150801603b166f7b7332bed6e990ef8c97b1a3a5ad99e36233861" exitCode=0 Mar 16 00:22:37 crc kubenswrapper[4816]: I0316 00:22:37.446755 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-284nb" event={"ID":"449b2c21-4396-4d46-af73-e670b282f831","Type":"ContainerDied","Data":"408f0d5869a150801603b166f7b7332bed6e990ef8c97b1a3a5ad99e36233861"} Mar 16 00:22:38 crc kubenswrapper[4816]: E0316 00:22:38.792517 4816 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/cluster-observability-operator/obo-prometheus-operator-admission-webhook-rhel9@sha256:42ebc3571195d8c41fd01b8d08e98fe2cc12c1caabea251aecb4442d8eade4ea" Mar 16 00:22:38 crc kubenswrapper[4816]: E0316 00:22:38.793008 4816 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:prometheus-operator-admission-webhook,Image:registry.redhat.io/cluster-observability-operator/obo-prometheus-operator-admission-webhook-rhel9@sha256:42ebc3571195d8c41fd01b8d08e98fe2cc12c1caabea251aecb4442d8eade4ea,Command:[],Args:[--web.enable-tls=true --web.cert-file=/tmp/k8s-webhook-server/serving-certs/tls.crt --web.key-file=/tmp/k8s-webhook-server/serving-certs/tls.key],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_CONDITION_NAME,Value:cluster-observability-operator.v1.3.1,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{209715200 0} {} 
BinarySI},},Requests:ResourceList{cpu: {{50 -3} {} 50m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:apiservice-cert,ReadOnly:false,MountPath:/apiserver.local.config/certificates,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod obo-prometheus-operator-admission-webhook-6d8d794bc-2s9sk_openshift-operators(9a808114-3164-4abe-a481-1b5d3b9df2a0): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 16 00:22:38 crc kubenswrapper[4816]: E0316 00:22:38.794566 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"prometheus-operator-admission-webhook\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6d8d794bc-2s9sk" podUID="9a808114-3164-4abe-a481-1b5d3b9df2a0" Mar 16 00:22:39 crc kubenswrapper[4816]: E0316 00:22:39.474561 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed 
to \"StartContainer\" for \"prometheus-operator-admission-webhook\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/cluster-observability-operator/obo-prometheus-operator-admission-webhook-rhel9@sha256:42ebc3571195d8c41fd01b8d08e98fe2cc12c1caabea251aecb4442d8eade4ea\\\"\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6d8d794bc-2s9sk" podUID="9a808114-3164-4abe-a481-1b5d3b9df2a0" Mar 16 00:22:40 crc kubenswrapper[4816]: E0316 00:22:40.499696 4816 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f43a292651495e3c7ec54a104d349e2f6851097dd5f8683239c8c438dec4317a is running failed: container process not found" containerID="f43a292651495e3c7ec54a104d349e2f6851097dd5f8683239c8c438dec4317a" cmd=["grpc_health_probe","-addr=:50051"] Mar 16 00:22:40 crc kubenswrapper[4816]: E0316 00:22:40.500696 4816 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f43a292651495e3c7ec54a104d349e2f6851097dd5f8683239c8c438dec4317a is running failed: container process not found" containerID="f43a292651495e3c7ec54a104d349e2f6851097dd5f8683239c8c438dec4317a" cmd=["grpc_health_probe","-addr=:50051"] Mar 16 00:22:40 crc kubenswrapper[4816]: E0316 00:22:40.504950 4816 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f43a292651495e3c7ec54a104d349e2f6851097dd5f8683239c8c438dec4317a is running failed: container process not found" containerID="f43a292651495e3c7ec54a104d349e2f6851097dd5f8683239c8c438dec4317a" cmd=["grpc_health_probe","-addr=:50051"] Mar 16 00:22:40 crc kubenswrapper[4816]: E0316 00:22:40.505056 4816 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 
f43a292651495e3c7ec54a104d349e2f6851097dd5f8683239c8c438dec4317a is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/certified-operators-jjrq7" podUID="d502b2b7-9ca7-4b92-bb7f-d1639a6a7ef2" containerName="registry-server" Mar 16 00:22:41 crc kubenswrapper[4816]: E0316 00:22:41.726015 4816 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 408f0d5869a150801603b166f7b7332bed6e990ef8c97b1a3a5ad99e36233861 is running failed: container process not found" containerID="408f0d5869a150801603b166f7b7332bed6e990ef8c97b1a3a5ad99e36233861" cmd=["grpc_health_probe","-addr=:50051"] Mar 16 00:22:41 crc kubenswrapper[4816]: E0316 00:22:41.726452 4816 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 408f0d5869a150801603b166f7b7332bed6e990ef8c97b1a3a5ad99e36233861 is running failed: container process not found" containerID="408f0d5869a150801603b166f7b7332bed6e990ef8c97b1a3a5ad99e36233861" cmd=["grpc_health_probe","-addr=:50051"] Mar 16 00:22:41 crc kubenswrapper[4816]: E0316 00:22:41.726842 4816 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 408f0d5869a150801603b166f7b7332bed6e990ef8c97b1a3a5ad99e36233861 is running failed: container process not found" containerID="408f0d5869a150801603b166f7b7332bed6e990ef8c97b1a3a5ad99e36233861" cmd=["grpc_health_probe","-addr=:50051"] Mar 16 00:22:41 crc kubenswrapper[4816]: E0316 00:22:41.726888 4816 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 408f0d5869a150801603b166f7b7332bed6e990ef8c97b1a3a5ad99e36233861 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/redhat-operators-284nb" 
podUID="449b2c21-4396-4d46-af73-e670b282f831" containerName="registry-server" Mar 16 00:22:42 crc kubenswrapper[4816]: E0316 00:22:42.511107 4816 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/cluster-observability-operator/cluster-observability-rhel9-operator@sha256:2ecf763b02048d2cf4c17967a7b2cacc7afd6af0e963a39579d876f8f4170e3c" Mar 16 00:22:42 crc kubenswrapper[4816]: E0316 00:22:42.511351 4816 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:registry.redhat.io/cluster-observability-operator/cluster-observability-rhel9-operator@sha256:2ecf763b02048d2cf4c17967a7b2cacc7afd6af0e963a39579d876f8f4170e3c,Command:[],Args:[--namespace=$(NAMESPACE) --images=perses=$(RELATED_IMAGE_PERSES) --images=alertmanager=$(RELATED_IMAGE_ALERTMANAGER) --images=prometheus=$(RELATED_IMAGE_PROMETHEUS) --images=thanos=$(RELATED_IMAGE_THANOS) --images=ui-dashboards=$(RELATED_IMAGE_CONSOLE_DASHBOARDS_PLUGIN) --images=ui-distributed-tracing=$(RELATED_IMAGE_CONSOLE_DISTRIBUTED_TRACING_PLUGIN) --images=ui-distributed-tracing-pf5=$(RELATED_IMAGE_CONSOLE_DISTRIBUTED_TRACING_PLUGIN_PF5) --images=ui-distributed-tracing-pf4=$(RELATED_IMAGE_CONSOLE_DISTRIBUTED_TRACING_PLUGIN_PF4) --images=ui-logging=$(RELATED_IMAGE_CONSOLE_LOGGING_PLUGIN) --images=ui-logging-pf4=$(RELATED_IMAGE_CONSOLE_LOGGING_PLUGIN_PF4) --images=ui-troubleshooting-panel=$(RELATED_IMAGE_CONSOLE_TROUBLESHOOTING_PANEL_PLUGIN) --images=ui-monitoring=$(RELATED_IMAGE_CONSOLE_MONITORING_PLUGIN) --images=ui-monitoring-pf5=$(RELATED_IMAGE_CONSOLE_MONITORING_PLUGIN_PF5) --images=korrel8r=$(RELATED_IMAGE_KORREL8R) --images=health-analyzer=$(RELATED_IMAGE_CLUSTER_HEALTH_ANALYZER) 
--openshift.enabled=true],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:RELATED_IMAGE_ALERTMANAGER,Value:registry.redhat.io/cluster-observability-operator/alertmanager-rhel9@sha256:dc62889b883f597de91b5389cc52c84c607247d49a807693be2f688e4703dfc3,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_PROMETHEUS,Value:registry.redhat.io/cluster-observability-operator/prometheus-rhel9@sha256:1b555e21bba7c609111ace4380382a696d9aceeb6e9816bf9023b8f689b6c741,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_THANOS,Value:registry.redhat.io/cluster-observability-operator/thanos-rhel9@sha256:a223bab813b82d698992490bbb60927f6288a83ba52d539836c250e1471f6d34,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_PERSES,Value:registry.redhat.io/cluster-observability-operator/perses-rhel9@sha256:e797cdb47beef40b04da7b6d645bca3dc32e6247003c45b56b38efd9e13bf01c,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_DASHBOARDS_PLUGIN,Value:registry.redhat.io/cluster-observability-operator/dashboards-console-plugin-rhel9@sha256:093d2731ac848ed5fd57356b155a19d3bf7b8db96d95b09c5d0095e143f7254f,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_DISTRIBUTED_TRACING_PLUGIN,Value:registry.redhat.io/cluster-observability-operator/distributed-tracing-console-plugin-rhel9@sha256:7d662a120305e2528acc7e9142b770b5b6a7f4932ddfcadfa4ac953935124895,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_DISTRIBUTED_TRACING_PLUGIN_PF5,Value:registry.redhat.io/cluster-observability-operator/distributed-tracing-console-plugin-pf5-rhel9@sha256:75465aabb0aa427a5c531a8fcde463f6d119afbcc618ebcbf6b7ee9bc8aad160,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_DISTRIBUTED_TRACING_PLUGIN_PF4,Value:registry.redhat.io/cluster-observability-operator/distributed-tracing-console-plugin-pf4-rhel9@sha256:dc18c8d6a4a9a0a574a57cc5082c8a9b26023bd6d69b9732892d58
4c1dfe5070,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_LOGGING_PLUGIN,Value:registry.redhat.io/cluster-observability-operator/logging-console-plugin-rhel9@sha256:369729978cecdc13c99ef3d179f8eb8a450a4a0cb70b63c27a55a15d1710ba27,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_LOGGING_PLUGIN_PF4,Value:registry.redhat.io/cluster-observability-operator/logging-console-plugin-pf4-rhel9@sha256:d8c7a61d147f62b204d5c5f16864386025393453c9a81ea327bbd25d7765d611,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_TROUBLESHOOTING_PANEL_PLUGIN,Value:registry.redhat.io/cluster-observability-operator/troubleshooting-panel-console-plugin-rhel9@sha256:b4a6eb1cc118a4334b424614959d8b7f361ddd779b3a72690ca49b0a3f26d9b8,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_MONITORING_PLUGIN,Value:registry.redhat.io/cluster-observability-operator/monitoring-console-plugin-rhel9@sha256:21d4fff670893ba4b7fbc528cd49f8b71c8281cede9ef84f0697065bb6a7fc50,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_MONITORING_PLUGIN_PF5,Value:registry.redhat.io/cluster-observability-operator/monitoring-console-plugin-pf5-rhel9@sha256:12d9dbe297a1c3b9df671f21156992082bc483887d851fafe76e5d17321ff474,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_KORREL8R,Value:registry.redhat.io/cluster-observability-operator/korrel8r-rhel9@sha256:e65c37f04f6d76a0cbfe05edb3cddf6a8f14f859ee35cf3aebea8fcb991d2c19,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CLUSTER_HEALTH_ANALYZER,Value:registry.redhat.io/cluster-observability-operator/cluster-health-analyzer-rhel9@sha256:48e4e178c6eeaa9d5dd77a591c185a311b4b4a5caadb7199d48463123e31dc9e,ValueFrom:nil,},EnvVar{Name:OPERATOR_CONDITION_NAME,Value:cluster-observability-operator.v1.3.1,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{400 -3} {} 400m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{100 -3} {} 100m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:observability-operator-tls,ReadOnly:true,MountPath:/etc/tls/private,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mz5bq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000350000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod observability-operator-59bdc8b94-w6wv7_openshift-operators(8d0f60fa-8d26-43ea-a680-1d3a92dd270d): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 16 00:22:42 crc kubenswrapper[4816]: E0316 00:22:42.513185 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-operators/observability-operator-59bdc8b94-w6wv7" podUID="8d0f60fa-8d26-43ea-a680-1d3a92dd270d" Mar 16 00:22:42 crc kubenswrapper[4816]: E0316 00:22:42.643625 4816 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="registry.redhat.io/cert-manager/cert-manager-operator-bundle@sha256:e4e3f81062da90a9cfcdce27085f0624952374a9aec5fbdd5796a09d24f83908" Mar 16 00:22:42 crc kubenswrapper[4816]: E0316 00:22:42.643803 4816 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:pull,Image:registry.redhat.io/cert-manager/cert-manager-operator-bundle@sha256:e4e3f81062da90a9cfcdce27085f0624952374a9aec5fbdd5796a09d24f83908,Command:[/util/cpb /bundle],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:bundle,ReadOnly:false,MountPath:/bundle,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:util,ReadOnly:false,MountPath:/util,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4582c,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59v87l_openshift-marketplace(1da45fda-a8cc-46c1-8831-58418ecc9819): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 16 00:22:42 crc kubenswrapper[4816]: E0316 00:22:42.645212 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"pull\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59v87l" podUID="1da45fda-a8cc-46c1-8831-58418ecc9819" Mar 16 00:22:43 crc kubenswrapper[4816]: I0316 00:22:43.084182 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-284nb" Mar 16 00:22:43 crc kubenswrapper[4816]: I0316 00:22:43.153940 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/449b2c21-4396-4d46-af73-e670b282f831-catalog-content\") pod \"449b2c21-4396-4d46-af73-e670b282f831\" (UID: \"449b2c21-4396-4d46-af73-e670b282f831\") " Mar 16 00:22:43 crc kubenswrapper[4816]: I0316 00:22:43.154023 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/449b2c21-4396-4d46-af73-e670b282f831-utilities\") pod \"449b2c21-4396-4d46-af73-e670b282f831\" (UID: \"449b2c21-4396-4d46-af73-e670b282f831\") " Mar 16 00:22:43 crc kubenswrapper[4816]: I0316 00:22:43.154073 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-44s4s\" (UniqueName: \"kubernetes.io/projected/449b2c21-4396-4d46-af73-e670b282f831-kube-api-access-44s4s\") pod \"449b2c21-4396-4d46-af73-e670b282f831\" (UID: \"449b2c21-4396-4d46-af73-e670b282f831\") " Mar 16 00:22:43 crc kubenswrapper[4816]: I0316 00:22:43.155692 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/449b2c21-4396-4d46-af73-e670b282f831-utilities" (OuterVolumeSpecName: "utilities") pod "449b2c21-4396-4d46-af73-e670b282f831" (UID: "449b2c21-4396-4d46-af73-e670b282f831"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:22:43 crc kubenswrapper[4816]: I0316 00:22:43.164199 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/449b2c21-4396-4d46-af73-e670b282f831-kube-api-access-44s4s" (OuterVolumeSpecName: "kube-api-access-44s4s") pod "449b2c21-4396-4d46-af73-e670b282f831" (UID: "449b2c21-4396-4d46-af73-e670b282f831"). InnerVolumeSpecName "kube-api-access-44s4s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:22:43 crc kubenswrapper[4816]: I0316 00:22:43.255474 4816 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/449b2c21-4396-4d46-af73-e670b282f831-utilities\") on node \"crc\" DevicePath \"\"" Mar 16 00:22:43 crc kubenswrapper[4816]: I0316 00:22:43.255854 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-44s4s\" (UniqueName: \"kubernetes.io/projected/449b2c21-4396-4d46-af73-e670b282f831-kube-api-access-44s4s\") on node \"crc\" DevicePath \"\"" Mar 16 00:22:43 crc kubenswrapper[4816]: I0316 00:22:43.293795 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/interconnect-operator-5bb49f789d-kd6nx"] Mar 16 00:22:43 crc kubenswrapper[4816]: I0316 00:22:43.294440 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jjrq7" Mar 16 00:22:43 crc kubenswrapper[4816]: W0316 00:22:43.301715 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf26ba6ee_c940_434d_80fe_81c813576ac9.slice/crio-466dbd55112a9fcfa484c0e2ae0d33ac4f4113021db62a805d38c5c3536098c2 WatchSource:0}: Error finding container 466dbd55112a9fcfa484c0e2ae0d33ac4f4113021db62a805d38c5c3536098c2: Status 404 returned error can't find the container with id 466dbd55112a9fcfa484c0e2ae0d33ac4f4113021db62a805d38c5c3536098c2 Mar 16 00:22:43 crc kubenswrapper[4816]: I0316 00:22:43.342528 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/449b2c21-4396-4d46-af73-e670b282f831-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "449b2c21-4396-4d46-af73-e670b282f831" (UID: "449b2c21-4396-4d46-af73-e670b282f831"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:22:43 crc kubenswrapper[4816]: I0316 00:22:43.356653 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d502b2b7-9ca7-4b92-bb7f-d1639a6a7ef2-catalog-content\") pod \"d502b2b7-9ca7-4b92-bb7f-d1639a6a7ef2\" (UID: \"d502b2b7-9ca7-4b92-bb7f-d1639a6a7ef2\") " Mar 16 00:22:43 crc kubenswrapper[4816]: I0316 00:22:43.356737 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-62z8f\" (UniqueName: \"kubernetes.io/projected/d502b2b7-9ca7-4b92-bb7f-d1639a6a7ef2-kube-api-access-62z8f\") pod \"d502b2b7-9ca7-4b92-bb7f-d1639a6a7ef2\" (UID: \"d502b2b7-9ca7-4b92-bb7f-d1639a6a7ef2\") " Mar 16 00:22:43 crc kubenswrapper[4816]: I0316 00:22:43.356840 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d502b2b7-9ca7-4b92-bb7f-d1639a6a7ef2-utilities\") pod \"d502b2b7-9ca7-4b92-bb7f-d1639a6a7ef2\" (UID: \"d502b2b7-9ca7-4b92-bb7f-d1639a6a7ef2\") " Mar 16 00:22:43 crc kubenswrapper[4816]: I0316 00:22:43.357084 4816 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/449b2c21-4396-4d46-af73-e670b282f831-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 16 00:22:43 crc kubenswrapper[4816]: I0316 00:22:43.358347 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d502b2b7-9ca7-4b92-bb7f-d1639a6a7ef2-utilities" (OuterVolumeSpecName: "utilities") pod "d502b2b7-9ca7-4b92-bb7f-d1639a6a7ef2" (UID: "d502b2b7-9ca7-4b92-bb7f-d1639a6a7ef2"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:22:43 crc kubenswrapper[4816]: I0316 00:22:43.366636 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d502b2b7-9ca7-4b92-bb7f-d1639a6a7ef2-kube-api-access-62z8f" (OuterVolumeSpecName: "kube-api-access-62z8f") pod "d502b2b7-9ca7-4b92-bb7f-d1639a6a7ef2" (UID: "d502b2b7-9ca7-4b92-bb7f-d1639a6a7ef2"). InnerVolumeSpecName "kube-api-access-62z8f". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:22:43 crc kubenswrapper[4816]: I0316 00:22:43.419275 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elastic-operator-8494b79c9c-fmbm9"] Mar 16 00:22:43 crc kubenswrapper[4816]: W0316 00:22:43.428063 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0c23dce2_a24c_4f57_9311_56675376c95e.slice/crio-307965d3063d52c3beb9c4eb556854acc9650d4c029c1214cf342be34f6b3bb0 WatchSource:0}: Error finding container 307965d3063d52c3beb9c4eb556854acc9650d4c029c1214cf342be34f6b3bb0: Status 404 returned error can't find the container with id 307965d3063d52c3beb9c4eb556854acc9650d4c029c1214cf342be34f6b3bb0 Mar 16 00:22:43 crc kubenswrapper[4816]: I0316 00:22:43.437575 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d502b2b7-9ca7-4b92-bb7f-d1639a6a7ef2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d502b2b7-9ca7-4b92-bb7f-d1639a6a7ef2" (UID: "d502b2b7-9ca7-4b92-bb7f-d1639a6a7ef2"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:22:43 crc kubenswrapper[4816]: I0316 00:22:43.458174 4816 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d502b2b7-9ca7-4b92-bb7f-d1639a6a7ef2-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 16 00:22:43 crc kubenswrapper[4816]: I0316 00:22:43.458215 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-62z8f\" (UniqueName: \"kubernetes.io/projected/d502b2b7-9ca7-4b92-bb7f-d1639a6a7ef2-kube-api-access-62z8f\") on node \"crc\" DevicePath \"\"" Mar 16 00:22:43 crc kubenswrapper[4816]: I0316 00:22:43.458231 4816 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d502b2b7-9ca7-4b92-bb7f-d1639a6a7ef2-utilities\") on node \"crc\" DevicePath \"\"" Mar 16 00:22:43 crc kubenswrapper[4816]: I0316 00:22:43.494605 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/interconnect-operator-5bb49f789d-kd6nx" event={"ID":"f26ba6ee-c940-434d-80fe-81c813576ac9","Type":"ContainerStarted","Data":"466dbd55112a9fcfa484c0e2ae0d33ac4f4113021db62a805d38c5c3536098c2"} Mar 16 00:22:43 crc kubenswrapper[4816]: I0316 00:22:43.496743 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jjrq7" event={"ID":"d502b2b7-9ca7-4b92-bb7f-d1639a6a7ef2","Type":"ContainerDied","Data":"5750fa2beb8413e4bbf66bc2e4ec103cf698ec3743191fcfdc001a528f09e12f"} Mar 16 00:22:43 crc kubenswrapper[4816]: I0316 00:22:43.496782 4816 scope.go:117] "RemoveContainer" containerID="f43a292651495e3c7ec54a104d349e2f6851097dd5f8683239c8c438dec4317a" Mar 16 00:22:43 crc kubenswrapper[4816]: I0316 00:22:43.496882 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jjrq7" Mar 16 00:22:43 crc kubenswrapper[4816]: I0316 00:22:43.501078 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-t7w7m" event={"ID":"f24959c1-f57f-4bf6-8a55-c8a35173ff8b","Type":"ContainerStarted","Data":"230879dbf66fa1b269eddaa463bf2ce64a38c23de24acb9dfaabeaf7f4d8419a"} Mar 16 00:22:43 crc kubenswrapper[4816]: I0316 00:22:43.501747 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-5bf474d74f-t7w7m" Mar 16 00:22:43 crc kubenswrapper[4816]: I0316 00:22:43.506965 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elastic-operator-8494b79c9c-fmbm9" event={"ID":"0c23dce2-a24c-4f57-9311-56675376c95e","Type":"ContainerStarted","Data":"307965d3063d52c3beb9c4eb556854acc9650d4c029c1214cf342be34f6b3bb0"} Mar 16 00:22:43 crc kubenswrapper[4816]: I0316 00:22:43.527836 4816 scope.go:117] "RemoveContainer" containerID="79acb07939ccc5b80b6a155b3eb5d1de07ee6d3e3a71e7de4d918cb8ef32d1ed" Mar 16 00:22:43 crc kubenswrapper[4816]: I0316 00:22:43.528036 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" event={"ID":"dd08ece2-7636-4966-973a-e96a34b70b53","Type":"ContainerStarted","Data":"d963d56deb174bcc1b2f530e646e1a1dbd328868a82631422f67c019c313cf52"} Mar 16 00:22:43 crc kubenswrapper[4816]: I0316 00:22:43.538955 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-5bf474d74f-t7w7m" podStartSLOduration=1.956176624 podStartE2EDuration="18.538936706s" podCreationTimestamp="2026-03-16 00:22:25 +0000 UTC" firstStartedPulling="2026-03-16 00:22:26.158507812 +0000 UTC m=+939.254807765" lastFinishedPulling="2026-03-16 00:22:42.741267904 +0000 UTC m=+955.837567847" observedRunningTime="2026-03-16 00:22:43.529318319 +0000 UTC m=+956.625618272" 
watchObservedRunningTime="2026-03-16 00:22:43.538936706 +0000 UTC m=+956.635236659" Mar 16 00:22:43 crc kubenswrapper[4816]: I0316 00:22:43.540676 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-284nb" Mar 16 00:22:43 crc kubenswrapper[4816]: E0316 00:22:43.553745 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/cluster-observability-operator/cluster-observability-rhel9-operator@sha256:2ecf763b02048d2cf4c17967a7b2cacc7afd6af0e963a39579d876f8f4170e3c\\\"\"" pod="openshift-operators/observability-operator-59bdc8b94-w6wv7" podUID="8d0f60fa-8d26-43ea-a680-1d3a92dd270d" Mar 16 00:22:43 crc kubenswrapper[4816]: I0316 00:22:43.562538 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-284nb" event={"ID":"449b2c21-4396-4d46-af73-e670b282f831","Type":"ContainerDied","Data":"d18b65047ef76f8aa8382e87295979d6bc4c30a17f5c2e1a21242e76f4c64c95"} Mar 16 00:22:43 crc kubenswrapper[4816]: I0316 00:22:43.562804 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6d8d794bc-6bdkn" event={"ID":"36951342-3370-4291-baa3-2612f64036fd","Type":"ContainerStarted","Data":"56bfd05b1bd5295a86432769cfca977bfdcaf5320dded943aae99ff75ddd2b3d"} Mar 16 00:22:43 crc kubenswrapper[4816]: I0316 00:22:43.562820 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-tfv44" event={"ID":"562f24fe-5c4c-4540-96ae-6e01f539141b","Type":"ContainerStarted","Data":"0b033ee0ae1fd900374e0904d62ee38cb53bab56f3b6256c9a5ba725203718cb"} Mar 16 00:22:43 crc kubenswrapper[4816]: I0316 00:22:43.566891 4816 scope.go:117] "RemoveContainer" containerID="e5839a641f5e4b47edb9ba28f2918214558d3b4454dab06397a37f22bd1120b7" Mar 16 00:22:43 crc 
kubenswrapper[4816]: I0316 00:22:43.584589 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jjrq7"] Mar 16 00:22:43 crc kubenswrapper[4816]: I0316 00:22:43.584645 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-jjrq7"] Mar 16 00:22:43 crc kubenswrapper[4816]: I0316 00:22:43.591051 4816 scope.go:117] "RemoveContainer" containerID="408f0d5869a150801603b166f7b7332bed6e990ef8c97b1a3a5ad99e36233861" Mar 16 00:22:43 crc kubenswrapper[4816]: I0316 00:22:43.620618 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-tfv44" podStartSLOduration=2.476263188 podStartE2EDuration="19.620582576s" podCreationTimestamp="2026-03-16 00:22:24 +0000 UTC" firstStartedPulling="2026-03-16 00:22:25.597119831 +0000 UTC m=+938.693419784" lastFinishedPulling="2026-03-16 00:22:42.741439219 +0000 UTC m=+955.837739172" observedRunningTime="2026-03-16 00:22:43.615325135 +0000 UTC m=+956.711625088" watchObservedRunningTime="2026-03-16 00:22:43.620582576 +0000 UTC m=+956.716882529" Mar 16 00:22:43 crc kubenswrapper[4816]: I0316 00:22:43.643730 4816 scope.go:117] "RemoveContainer" containerID="6675088a9bc77bc6e858928c42b983a7c8adb51e5f6f5dc372c160b6c32fce59" Mar 16 00:22:43 crc kubenswrapper[4816]: I0316 00:22:43.658952 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6d8d794bc-6bdkn" podStartSLOduration=2.917420158 podStartE2EDuration="19.65893207s" podCreationTimestamp="2026-03-16 00:22:24 +0000 UTC" firstStartedPulling="2026-03-16 00:22:25.999128564 +0000 UTC m=+939.095428517" lastFinishedPulling="2026-03-16 00:22:42.740640476 +0000 UTC m=+955.836940429" observedRunningTime="2026-03-16 00:22:43.656001486 +0000 UTC m=+956.752301439" watchObservedRunningTime="2026-03-16 00:22:43.65893207 +0000 UTC m=+956.755232023" Mar 16 
00:22:43 crc kubenswrapper[4816]: I0316 00:22:43.682785 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d502b2b7-9ca7-4b92-bb7f-d1639a6a7ef2" path="/var/lib/kubelet/pods/d502b2b7-9ca7-4b92-bb7f-d1639a6a7ef2/volumes" Mar 16 00:22:43 crc kubenswrapper[4816]: I0316 00:22:43.708615 4816 scope.go:117] "RemoveContainer" containerID="1891eef8092d3a2994e7d2d76188c04ec2abc90123f32af569269e31014fdb64" Mar 16 00:22:43 crc kubenswrapper[4816]: I0316 00:22:43.720016 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-284nb"] Mar 16 00:22:43 crc kubenswrapper[4816]: I0316 00:22:43.741404 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-284nb"] Mar 16 00:22:45 crc kubenswrapper[4816]: I0316 00:22:45.676512 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="449b2c21-4396-4d46-af73-e670b282f831" path="/var/lib/kubelet/pods/449b2c21-4396-4d46-af73-e670b282f831/volumes" Mar 16 00:22:47 crc kubenswrapper[4816]: I0316 00:22:47.589972 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elastic-operator-8494b79c9c-fmbm9" event={"ID":"0c23dce2-a24c-4f57-9311-56675376c95e","Type":"ContainerStarted","Data":"09c44097f0ad4bab89366142ec9f2d996be7ca94ef2aea5cae839d9f0610e896"} Mar 16 00:22:47 crc kubenswrapper[4816]: I0316 00:22:47.609369 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/elastic-operator-8494b79c9c-fmbm9" podStartSLOduration=14.341006016 podStartE2EDuration="17.609343462s" podCreationTimestamp="2026-03-16 00:22:30 +0000 UTC" firstStartedPulling="2026-03-16 00:22:43.431866175 +0000 UTC m=+956.528166128" lastFinishedPulling="2026-03-16 00:22:46.700203621 +0000 UTC m=+959.796503574" observedRunningTime="2026-03-16 00:22:47.605779309 +0000 UTC m=+960.702079262" watchObservedRunningTime="2026-03-16 00:22:47.609343462 +0000 UTC m=+960.705643415" Mar 16 00:22:47 crc 
kubenswrapper[4816]: I0316 00:22:47.944766 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/elasticsearch-es-default-0"] Mar 16 00:22:47 crc kubenswrapper[4816]: E0316 00:22:47.945235 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="449b2c21-4396-4d46-af73-e670b282f831" containerName="registry-server" Mar 16 00:22:47 crc kubenswrapper[4816]: I0316 00:22:47.945247 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="449b2c21-4396-4d46-af73-e670b282f831" containerName="registry-server" Mar 16 00:22:47 crc kubenswrapper[4816]: E0316 00:22:47.945260 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="449b2c21-4396-4d46-af73-e670b282f831" containerName="extract-content" Mar 16 00:22:47 crc kubenswrapper[4816]: I0316 00:22:47.945268 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="449b2c21-4396-4d46-af73-e670b282f831" containerName="extract-content" Mar 16 00:22:47 crc kubenswrapper[4816]: E0316 00:22:47.945276 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="449b2c21-4396-4d46-af73-e670b282f831" containerName="extract-utilities" Mar 16 00:22:47 crc kubenswrapper[4816]: I0316 00:22:47.945284 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="449b2c21-4396-4d46-af73-e670b282f831" containerName="extract-utilities" Mar 16 00:22:47 crc kubenswrapper[4816]: E0316 00:22:47.945292 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d502b2b7-9ca7-4b92-bb7f-d1639a6a7ef2" containerName="extract-content" Mar 16 00:22:47 crc kubenswrapper[4816]: I0316 00:22:47.945299 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="d502b2b7-9ca7-4b92-bb7f-d1639a6a7ef2" containerName="extract-content" Mar 16 00:22:47 crc kubenswrapper[4816]: E0316 00:22:47.945311 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d502b2b7-9ca7-4b92-bb7f-d1639a6a7ef2" containerName="extract-utilities" Mar 16 00:22:47 crc kubenswrapper[4816]: I0316 
00:22:47.945317 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="d502b2b7-9ca7-4b92-bb7f-d1639a6a7ef2" containerName="extract-utilities" Mar 16 00:22:47 crc kubenswrapper[4816]: E0316 00:22:47.945333 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d502b2b7-9ca7-4b92-bb7f-d1639a6a7ef2" containerName="registry-server" Mar 16 00:22:47 crc kubenswrapper[4816]: I0316 00:22:47.945340 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="d502b2b7-9ca7-4b92-bb7f-d1639a6a7ef2" containerName="registry-server" Mar 16 00:22:47 crc kubenswrapper[4816]: I0316 00:22:47.945447 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="d502b2b7-9ca7-4b92-bb7f-d1639a6a7ef2" containerName="registry-server" Mar 16 00:22:47 crc kubenswrapper[4816]: I0316 00:22:47.945457 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="449b2c21-4396-4d46-af73-e670b282f831" containerName="registry-server" Mar 16 00:22:47 crc kubenswrapper[4816]: I0316 00:22:47.946339 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/elasticsearch-es-default-0" Mar 16 00:22:47 crc kubenswrapper[4816]: I0316 00:22:47.951037 4816 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-remote-ca" Mar 16 00:22:47 crc kubenswrapper[4816]: I0316 00:22:47.951412 4816 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-http-certs-internal" Mar 16 00:22:47 crc kubenswrapper[4816]: I0316 00:22:47.951579 4816 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-default-es-transport-certs" Mar 16 00:22:47 crc kubenswrapper[4816]: I0316 00:22:47.951755 4816 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-default-es-config" Mar 16 00:22:47 crc kubenswrapper[4816]: I0316 00:22:47.952216 4816 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-dockercfg-zjx5n" Mar 16 00:22:47 crc kubenswrapper[4816]: I0316 00:22:47.952468 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"elasticsearch-es-scripts" Mar 16 00:22:47 crc kubenswrapper[4816]: I0316 00:22:47.952528 4816 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-xpack-file-realm" Mar 16 00:22:47 crc kubenswrapper[4816]: I0316 00:22:47.952673 4816 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-internal-users" Mar 16 00:22:47 crc kubenswrapper[4816]: I0316 00:22:47.962860 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"elasticsearch-es-unicast-hosts" Mar 16 00:22:47 crc kubenswrapper[4816]: I0316 00:22:47.975917 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elasticsearch-es-default-0"] Mar 16 00:22:48 crc kubenswrapper[4816]: I0316 00:22:48.073544 4816 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-probe-user\" (UniqueName: \"kubernetes.io/secret/819af9fc-6db9-4743-bd06-f844f5ef5b0d-elastic-internal-probe-user\") pod \"elasticsearch-es-default-0\" (UID: \"819af9fc-6db9-4743-bd06-f844f5ef5b0d\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 16 00:22:48 crc kubenswrapper[4816]: I0316 00:22:48.073611 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-scripts\" (UniqueName: \"kubernetes.io/configmap/819af9fc-6db9-4743-bd06-f844f5ef5b0d-elastic-internal-scripts\") pod \"elasticsearch-es-default-0\" (UID: \"819af9fc-6db9-4743-bd06-f844f5ef5b0d\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 16 00:22:48 crc kubenswrapper[4816]: I0316 00:22:48.073646 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-elasticsearch-config\" (UniqueName: \"kubernetes.io/secret/819af9fc-6db9-4743-bd06-f844f5ef5b0d-elastic-internal-elasticsearch-config\") pod \"elasticsearch-es-default-0\" (UID: \"819af9fc-6db9-4743-bd06-f844f5ef5b0d\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 16 00:22:48 crc kubenswrapper[4816]: I0316 00:22:48.073673 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-volume\" (UniqueName: \"kubernetes.io/empty-dir/819af9fc-6db9-4743-bd06-f844f5ef5b0d-tmp-volume\") pod \"elasticsearch-es-default-0\" (UID: \"819af9fc-6db9-4743-bd06-f844f5ef5b0d\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 16 00:22:48 crc kubenswrapper[4816]: I0316 00:22:48.073751 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-remote-certificate-authorities\" (UniqueName: \"kubernetes.io/secret/819af9fc-6db9-4743-bd06-f844f5ef5b0d-elastic-internal-remote-certificate-authorities\") 
pod \"elasticsearch-es-default-0\" (UID: \"819af9fc-6db9-4743-bd06-f844f5ef5b0d\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 16 00:22:48 crc kubenswrapper[4816]: I0316 00:22:48.073841 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elasticsearch-logs\" (UniqueName: \"kubernetes.io/empty-dir/819af9fc-6db9-4743-bd06-f844f5ef5b0d-elasticsearch-logs\") pod \"elasticsearch-es-default-0\" (UID: \"819af9fc-6db9-4743-bd06-f844f5ef5b0d\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 16 00:22:48 crc kubenswrapper[4816]: I0316 00:22:48.073869 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-transport-certificates\" (UniqueName: \"kubernetes.io/secret/819af9fc-6db9-4743-bd06-f844f5ef5b0d-elastic-internal-transport-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"819af9fc-6db9-4743-bd06-f844f5ef5b0d\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 16 00:22:48 crc kubenswrapper[4816]: I0316 00:22:48.073897 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-elasticsearch-config-local\" (UniqueName: \"kubernetes.io/empty-dir/819af9fc-6db9-4743-bd06-f844f5ef5b0d-elastic-internal-elasticsearch-config-local\") pod \"elasticsearch-es-default-0\" (UID: \"819af9fc-6db9-4743-bd06-f844f5ef5b0d\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 16 00:22:48 crc kubenswrapper[4816]: I0316 00:22:48.073931 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-http-certificates\" (UniqueName: \"kubernetes.io/secret/819af9fc-6db9-4743-bd06-f844f5ef5b0d-elastic-internal-http-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"819af9fc-6db9-4743-bd06-f844f5ef5b0d\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 16 00:22:48 crc kubenswrapper[4816]: I0316 00:22:48.074021 
4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-xpack-file-realm\" (UniqueName: \"kubernetes.io/secret/819af9fc-6db9-4743-bd06-f844f5ef5b0d-elastic-internal-xpack-file-realm\") pod \"elasticsearch-es-default-0\" (UID: \"819af9fc-6db9-4743-bd06-f844f5ef5b0d\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 16 00:22:48 crc kubenswrapper[4816]: I0316 00:22:48.074076 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-elasticsearch-bin-local\" (UniqueName: \"kubernetes.io/empty-dir/819af9fc-6db9-4743-bd06-f844f5ef5b0d-elastic-internal-elasticsearch-bin-local\") pod \"elasticsearch-es-default-0\" (UID: \"819af9fc-6db9-4743-bd06-f844f5ef5b0d\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 16 00:22:48 crc kubenswrapper[4816]: I0316 00:22:48.074108 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"downward-api\" (UniqueName: \"kubernetes.io/downward-api/819af9fc-6db9-4743-bd06-f844f5ef5b0d-downward-api\") pod \"elasticsearch-es-default-0\" (UID: \"819af9fc-6db9-4743-bd06-f844f5ef5b0d\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 16 00:22:48 crc kubenswrapper[4816]: I0316 00:22:48.074136 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-elasticsearch-plugins-local\" (UniqueName: \"kubernetes.io/empty-dir/819af9fc-6db9-4743-bd06-f844f5ef5b0d-elastic-internal-elasticsearch-plugins-local\") pod \"elasticsearch-es-default-0\" (UID: \"819af9fc-6db9-4743-bd06-f844f5ef5b0d\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 16 00:22:48 crc kubenswrapper[4816]: I0316 00:22:48.074189 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-unicast-hosts\" (UniqueName: 
\"kubernetes.io/configmap/819af9fc-6db9-4743-bd06-f844f5ef5b0d-elastic-internal-unicast-hosts\") pod \"elasticsearch-es-default-0\" (UID: \"819af9fc-6db9-4743-bd06-f844f5ef5b0d\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 16 00:22:48 crc kubenswrapper[4816]: I0316 00:22:48.074210 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elasticsearch-data\" (UniqueName: \"kubernetes.io/empty-dir/819af9fc-6db9-4743-bd06-f844f5ef5b0d-elasticsearch-data\") pod \"elasticsearch-es-default-0\" (UID: \"819af9fc-6db9-4743-bd06-f844f5ef5b0d\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 16 00:22:48 crc kubenswrapper[4816]: I0316 00:22:48.176398 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-unicast-hosts\" (UniqueName: \"kubernetes.io/configmap/819af9fc-6db9-4743-bd06-f844f5ef5b0d-elastic-internal-unicast-hosts\") pod \"elasticsearch-es-default-0\" (UID: \"819af9fc-6db9-4743-bd06-f844f5ef5b0d\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 16 00:22:48 crc kubenswrapper[4816]: I0316 00:22:48.176474 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elasticsearch-data\" (UniqueName: \"kubernetes.io/empty-dir/819af9fc-6db9-4743-bd06-f844f5ef5b0d-elasticsearch-data\") pod \"elasticsearch-es-default-0\" (UID: \"819af9fc-6db9-4743-bd06-f844f5ef5b0d\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 16 00:22:48 crc kubenswrapper[4816]: I0316 00:22:48.176512 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-probe-user\" (UniqueName: \"kubernetes.io/secret/819af9fc-6db9-4743-bd06-f844f5ef5b0d-elastic-internal-probe-user\") pod \"elasticsearch-es-default-0\" (UID: \"819af9fc-6db9-4743-bd06-f844f5ef5b0d\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 16 00:22:48 crc kubenswrapper[4816]: I0316 00:22:48.176575 4816 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"elastic-internal-scripts\" (UniqueName: \"kubernetes.io/configmap/819af9fc-6db9-4743-bd06-f844f5ef5b0d-elastic-internal-scripts\") pod \"elasticsearch-es-default-0\" (UID: \"819af9fc-6db9-4743-bd06-f844f5ef5b0d\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 16 00:22:48 crc kubenswrapper[4816]: I0316 00:22:48.176628 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-elasticsearch-config\" (UniqueName: \"kubernetes.io/secret/819af9fc-6db9-4743-bd06-f844f5ef5b0d-elastic-internal-elasticsearch-config\") pod \"elasticsearch-es-default-0\" (UID: \"819af9fc-6db9-4743-bd06-f844f5ef5b0d\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 16 00:22:48 crc kubenswrapper[4816]: I0316 00:22:48.176868 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp-volume\" (UniqueName: \"kubernetes.io/empty-dir/819af9fc-6db9-4743-bd06-f844f5ef5b0d-tmp-volume\") pod \"elasticsearch-es-default-0\" (UID: \"819af9fc-6db9-4743-bd06-f844f5ef5b0d\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 16 00:22:48 crc kubenswrapper[4816]: I0316 00:22:48.176908 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-remote-certificate-authorities\" (UniqueName: \"kubernetes.io/secret/819af9fc-6db9-4743-bd06-f844f5ef5b0d-elastic-internal-remote-certificate-authorities\") pod \"elasticsearch-es-default-0\" (UID: \"819af9fc-6db9-4743-bd06-f844f5ef5b0d\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 16 00:22:48 crc kubenswrapper[4816]: I0316 00:22:48.176955 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elasticsearch-logs\" (UniqueName: \"kubernetes.io/empty-dir/819af9fc-6db9-4743-bd06-f844f5ef5b0d-elasticsearch-logs\") pod \"elasticsearch-es-default-0\" (UID: \"819af9fc-6db9-4743-bd06-f844f5ef5b0d\") " pod="service-telemetry/elasticsearch-es-default-0" 
Mar 16 00:22:48 crc kubenswrapper[4816]: I0316 00:22:48.176983 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-transport-certificates\" (UniqueName: \"kubernetes.io/secret/819af9fc-6db9-4743-bd06-f844f5ef5b0d-elastic-internal-transport-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"819af9fc-6db9-4743-bd06-f844f5ef5b0d\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 16 00:22:48 crc kubenswrapper[4816]: I0316 00:22:48.177019 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-elasticsearch-config-local\" (UniqueName: \"kubernetes.io/empty-dir/819af9fc-6db9-4743-bd06-f844f5ef5b0d-elastic-internal-elasticsearch-config-local\") pod \"elasticsearch-es-default-0\" (UID: \"819af9fc-6db9-4743-bd06-f844f5ef5b0d\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 16 00:22:48 crc kubenswrapper[4816]: I0316 00:22:48.177057 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-http-certificates\" (UniqueName: \"kubernetes.io/secret/819af9fc-6db9-4743-bd06-f844f5ef5b0d-elastic-internal-http-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"819af9fc-6db9-4743-bd06-f844f5ef5b0d\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 16 00:22:48 crc kubenswrapper[4816]: I0316 00:22:48.177086 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-xpack-file-realm\" (UniqueName: \"kubernetes.io/secret/819af9fc-6db9-4743-bd06-f844f5ef5b0d-elastic-internal-xpack-file-realm\") pod \"elasticsearch-es-default-0\" (UID: \"819af9fc-6db9-4743-bd06-f844f5ef5b0d\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 16 00:22:48 crc kubenswrapper[4816]: I0316 00:22:48.177118 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-elasticsearch-bin-local\" (UniqueName: 
\"kubernetes.io/empty-dir/819af9fc-6db9-4743-bd06-f844f5ef5b0d-elastic-internal-elasticsearch-bin-local\") pod \"elasticsearch-es-default-0\" (UID: \"819af9fc-6db9-4743-bd06-f844f5ef5b0d\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 16 00:22:48 crc kubenswrapper[4816]: I0316 00:22:48.177150 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"downward-api\" (UniqueName: \"kubernetes.io/downward-api/819af9fc-6db9-4743-bd06-f844f5ef5b0d-downward-api\") pod \"elasticsearch-es-default-0\" (UID: \"819af9fc-6db9-4743-bd06-f844f5ef5b0d\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 16 00:22:48 crc kubenswrapper[4816]: I0316 00:22:48.177179 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-elasticsearch-plugins-local\" (UniqueName: \"kubernetes.io/empty-dir/819af9fc-6db9-4743-bd06-f844f5ef5b0d-elastic-internal-elasticsearch-plugins-local\") pod \"elasticsearch-es-default-0\" (UID: \"819af9fc-6db9-4743-bd06-f844f5ef5b0d\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 16 00:22:48 crc kubenswrapper[4816]: I0316 00:22:48.177303 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elasticsearch-data\" (UniqueName: \"kubernetes.io/empty-dir/819af9fc-6db9-4743-bd06-f844f5ef5b0d-elasticsearch-data\") pod \"elasticsearch-es-default-0\" (UID: \"819af9fc-6db9-4743-bd06-f844f5ef5b0d\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 16 00:22:48 crc kubenswrapper[4816]: I0316 00:22:48.177678 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-elasticsearch-config-local\" (UniqueName: \"kubernetes.io/empty-dir/819af9fc-6db9-4743-bd06-f844f5ef5b0d-elastic-internal-elasticsearch-config-local\") pod \"elasticsearch-es-default-0\" (UID: \"819af9fc-6db9-4743-bd06-f844f5ef5b0d\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 16 00:22:48 crc kubenswrapper[4816]: I0316 00:22:48.177956 4816 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-elasticsearch-plugins-local\" (UniqueName: \"kubernetes.io/empty-dir/819af9fc-6db9-4743-bd06-f844f5ef5b0d-elastic-internal-elasticsearch-plugins-local\") pod \"elasticsearch-es-default-0\" (UID: \"819af9fc-6db9-4743-bd06-f844f5ef5b0d\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 16 00:22:48 crc kubenswrapper[4816]: I0316 00:22:48.178056 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elasticsearch-logs\" (UniqueName: \"kubernetes.io/empty-dir/819af9fc-6db9-4743-bd06-f844f5ef5b0d-elasticsearch-logs\") pod \"elasticsearch-es-default-0\" (UID: \"819af9fc-6db9-4743-bd06-f844f5ef5b0d\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 16 00:22:48 crc kubenswrapper[4816]: I0316 00:22:48.178477 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-unicast-hosts\" (UniqueName: \"kubernetes.io/configmap/819af9fc-6db9-4743-bd06-f844f5ef5b0d-elastic-internal-unicast-hosts\") pod \"elasticsearch-es-default-0\" (UID: \"819af9fc-6db9-4743-bd06-f844f5ef5b0d\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 16 00:22:48 crc kubenswrapper[4816]: I0316 00:22:48.178661 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-scripts\" (UniqueName: \"kubernetes.io/configmap/819af9fc-6db9-4743-bd06-f844f5ef5b0d-elastic-internal-scripts\") pod \"elasticsearch-es-default-0\" (UID: \"819af9fc-6db9-4743-bd06-f844f5ef5b0d\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 16 00:22:48 crc kubenswrapper[4816]: I0316 00:22:48.178824 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp-volume\" (UniqueName: \"kubernetes.io/empty-dir/819af9fc-6db9-4743-bd06-f844f5ef5b0d-tmp-volume\") pod \"elasticsearch-es-default-0\" (UID: \"819af9fc-6db9-4743-bd06-f844f5ef5b0d\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 16 00:22:48 crc 
kubenswrapper[4816]: I0316 00:22:48.178999 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-elasticsearch-bin-local\" (UniqueName: \"kubernetes.io/empty-dir/819af9fc-6db9-4743-bd06-f844f5ef5b0d-elastic-internal-elasticsearch-bin-local\") pod \"elasticsearch-es-default-0\" (UID: \"819af9fc-6db9-4743-bd06-f844f5ef5b0d\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 16 00:22:48 crc kubenswrapper[4816]: I0316 00:22:48.182476 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-elasticsearch-config\" (UniqueName: \"kubernetes.io/secret/819af9fc-6db9-4743-bd06-f844f5ef5b0d-elastic-internal-elasticsearch-config\") pod \"elasticsearch-es-default-0\" (UID: \"819af9fc-6db9-4743-bd06-f844f5ef5b0d\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 16 00:22:48 crc kubenswrapper[4816]: I0316 00:22:48.182967 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-probe-user\" (UniqueName: \"kubernetes.io/secret/819af9fc-6db9-4743-bd06-f844f5ef5b0d-elastic-internal-probe-user\") pod \"elasticsearch-es-default-0\" (UID: \"819af9fc-6db9-4743-bd06-f844f5ef5b0d\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 16 00:22:48 crc kubenswrapper[4816]: I0316 00:22:48.183988 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"downward-api\" (UniqueName: \"kubernetes.io/downward-api/819af9fc-6db9-4743-bd06-f844f5ef5b0d-downward-api\") pod \"elasticsearch-es-default-0\" (UID: \"819af9fc-6db9-4743-bd06-f844f5ef5b0d\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 16 00:22:48 crc kubenswrapper[4816]: I0316 00:22:48.184117 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-remote-certificate-authorities\" (UniqueName: \"kubernetes.io/secret/819af9fc-6db9-4743-bd06-f844f5ef5b0d-elastic-internal-remote-certificate-authorities\") pod \"elasticsearch-es-default-0\" (UID: 
\"819af9fc-6db9-4743-bd06-f844f5ef5b0d\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 16 00:22:48 crc kubenswrapper[4816]: I0316 00:22:48.185244 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-http-certificates\" (UniqueName: \"kubernetes.io/secret/819af9fc-6db9-4743-bd06-f844f5ef5b0d-elastic-internal-http-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"819af9fc-6db9-4743-bd06-f844f5ef5b0d\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 16 00:22:48 crc kubenswrapper[4816]: I0316 00:22:48.186763 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-xpack-file-realm\" (UniqueName: \"kubernetes.io/secret/819af9fc-6db9-4743-bd06-f844f5ef5b0d-elastic-internal-xpack-file-realm\") pod \"elasticsearch-es-default-0\" (UID: \"819af9fc-6db9-4743-bd06-f844f5ef5b0d\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 16 00:22:48 crc kubenswrapper[4816]: I0316 00:22:48.190496 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-transport-certificates\" (UniqueName: \"kubernetes.io/secret/819af9fc-6db9-4743-bd06-f844f5ef5b0d-elastic-internal-transport-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"819af9fc-6db9-4743-bd06-f844f5ef5b0d\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 16 00:22:48 crc kubenswrapper[4816]: I0316 00:22:48.264417 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/elasticsearch-es-default-0" Mar 16 00:22:52 crc kubenswrapper[4816]: I0316 00:22:52.075150 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elasticsearch-es-default-0"] Mar 16 00:22:52 crc kubenswrapper[4816]: W0316 00:22:52.098728 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod819af9fc_6db9_4743_bd06_f844f5ef5b0d.slice/crio-4adaa61025ba1e8ab254fb2d53985c4f897bee3baaba37099e987fb4605316da WatchSource:0}: Error finding container 4adaa61025ba1e8ab254fb2d53985c4f897bee3baaba37099e987fb4605316da: Status 404 returned error can't find the container with id 4adaa61025ba1e8ab254fb2d53985c4f897bee3baaba37099e987fb4605316da Mar 16 00:22:52 crc kubenswrapper[4816]: I0316 00:22:52.617765 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elasticsearch-es-default-0" event={"ID":"819af9fc-6db9-4743-bd06-f844f5ef5b0d","Type":"ContainerStarted","Data":"4adaa61025ba1e8ab254fb2d53985c4f897bee3baaba37099e987fb4605316da"} Mar 16 00:22:52 crc kubenswrapper[4816]: I0316 00:22:52.618905 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/interconnect-operator-5bb49f789d-kd6nx" event={"ID":"f26ba6ee-c940-434d-80fe-81c813576ac9","Type":"ContainerStarted","Data":"b9e03202df033fa3e436ece2c2d18d48351c2b0554812fb3c9f999fac9ec3ca2"} Mar 16 00:22:52 crc kubenswrapper[4816]: I0316 00:22:52.621301 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6d8d794bc-2s9sk" event={"ID":"9a808114-3164-4abe-a481-1b5d3b9df2a0","Type":"ContainerStarted","Data":"d58cfae0108d1b4f237761f15cab98577ff90c022b58bb47cf4c218838194f3e"} Mar 16 00:22:52 crc kubenswrapper[4816]: I0316 00:22:52.670513 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-6d8d794bc-2s9sk" podStartSLOduration=-9223372008.184288 podStartE2EDuration="28.670487269s" podCreationTimestamp="2026-03-16 00:22:24 +0000 UTC" firstStartedPulling="2026-03-16 00:22:25.718897757 +0000 UTC m=+938.815197710" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:22:52.667150163 +0000 UTC m=+965.763450116" watchObservedRunningTime="2026-03-16 00:22:52.670487269 +0000 UTC m=+965.766787222" Mar 16 00:22:52 crc kubenswrapper[4816]: I0316 00:22:52.671566 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/interconnect-operator-5bb49f789d-kd6nx" podStartSLOduration=11.061688506 podStartE2EDuration="19.671542309s" podCreationTimestamp="2026-03-16 00:22:33 +0000 UTC" firstStartedPulling="2026-03-16 00:22:43.309492542 +0000 UTC m=+956.405792495" lastFinishedPulling="2026-03-16 00:22:51.919346345 +0000 UTC m=+965.015646298" observedRunningTime="2026-03-16 00:22:52.636691416 +0000 UTC m=+965.732991369" watchObservedRunningTime="2026-03-16 00:22:52.671542309 +0000 UTC m=+965.767842262" Mar 16 00:22:55 crc kubenswrapper[4816]: I0316 00:22:55.645467 4816 generic.go:334] "Generic (PLEG): container finished" podID="1da45fda-a8cc-46c1-8831-58418ecc9819" containerID="f0b6604af19fa4322ba98c463fa0ec289db6cb21f72ae73d9215646940dfdad1" exitCode=0 Mar 16 00:22:55 crc kubenswrapper[4816]: I0316 00:22:55.646515 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59v87l" event={"ID":"1da45fda-a8cc-46c1-8831-58418ecc9819","Type":"ContainerDied","Data":"f0b6604af19fa4322ba98c463fa0ec289db6cb21f72ae73d9215646940dfdad1"} Mar 16 00:22:55 crc kubenswrapper[4816]: I0316 00:22:55.749667 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-5bf474d74f-t7w7m" Mar 16 00:22:56 crc kubenswrapper[4816]: 
I0316 00:22:56.658016 4816 generic.go:334] "Generic (PLEG): container finished" podID="1da45fda-a8cc-46c1-8831-58418ecc9819" containerID="7595ee672d5f6277a3926773b74d4e4c2739ea44ae1b6448b9f0308af4417a33" exitCode=0 Mar 16 00:22:56 crc kubenswrapper[4816]: I0316 00:22:56.658061 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59v87l" event={"ID":"1da45fda-a8cc-46c1-8831-58418ecc9819","Type":"ContainerDied","Data":"7595ee672d5f6277a3926773b74d4e4c2739ea44ae1b6448b9f0308af4417a33"} Mar 16 00:23:02 crc kubenswrapper[4816]: I0316 00:23:02.791522 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59v87l" Mar 16 00:23:02 crc kubenswrapper[4816]: I0316 00:23:02.892582 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1da45fda-a8cc-46c1-8831-58418ecc9819-bundle\") pod \"1da45fda-a8cc-46c1-8831-58418ecc9819\" (UID: \"1da45fda-a8cc-46c1-8831-58418ecc9819\") " Mar 16 00:23:02 crc kubenswrapper[4816]: I0316 00:23:02.892653 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1da45fda-a8cc-46c1-8831-58418ecc9819-util\") pod \"1da45fda-a8cc-46c1-8831-58418ecc9819\" (UID: \"1da45fda-a8cc-46c1-8831-58418ecc9819\") " Mar 16 00:23:02 crc kubenswrapper[4816]: I0316 00:23:02.892719 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4582c\" (UniqueName: \"kubernetes.io/projected/1da45fda-a8cc-46c1-8831-58418ecc9819-kube-api-access-4582c\") pod \"1da45fda-a8cc-46c1-8831-58418ecc9819\" (UID: \"1da45fda-a8cc-46c1-8831-58418ecc9819\") " Mar 16 00:23:02 crc kubenswrapper[4816]: I0316 00:23:02.893430 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/1da45fda-a8cc-46c1-8831-58418ecc9819-bundle" (OuterVolumeSpecName: "bundle") pod "1da45fda-a8cc-46c1-8831-58418ecc9819" (UID: "1da45fda-a8cc-46c1-8831-58418ecc9819"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:23:02 crc kubenswrapper[4816]: I0316 00:23:02.898383 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1da45fda-a8cc-46c1-8831-58418ecc9819-kube-api-access-4582c" (OuterVolumeSpecName: "kube-api-access-4582c") pod "1da45fda-a8cc-46c1-8831-58418ecc9819" (UID: "1da45fda-a8cc-46c1-8831-58418ecc9819"). InnerVolumeSpecName "kube-api-access-4582c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:23:02 crc kubenswrapper[4816]: I0316 00:23:02.904833 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1da45fda-a8cc-46c1-8831-58418ecc9819-util" (OuterVolumeSpecName: "util") pod "1da45fda-a8cc-46c1-8831-58418ecc9819" (UID: "1da45fda-a8cc-46c1-8831-58418ecc9819"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:23:02 crc kubenswrapper[4816]: I0316 00:23:02.993838 4816 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1da45fda-a8cc-46c1-8831-58418ecc9819-util\") on node \"crc\" DevicePath \"\"" Mar 16 00:23:02 crc kubenswrapper[4816]: I0316 00:23:02.993879 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4582c\" (UniqueName: \"kubernetes.io/projected/1da45fda-a8cc-46c1-8831-58418ecc9819-kube-api-access-4582c\") on node \"crc\" DevicePath \"\"" Mar 16 00:23:02 crc kubenswrapper[4816]: I0316 00:23:02.993895 4816 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1da45fda-a8cc-46c1-8831-58418ecc9819-bundle\") on node \"crc\" DevicePath \"\"" Mar 16 00:23:03 crc kubenswrapper[4816]: I0316 00:23:03.704934 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59v87l" event={"ID":"1da45fda-a8cc-46c1-8831-58418ecc9819","Type":"ContainerDied","Data":"423ca09a07f2da9396c84b4219be8387d28d6dd64d1f4c92b01055a8dae546ea"} Mar 16 00:23:03 crc kubenswrapper[4816]: I0316 00:23:03.704977 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="423ca09a07f2da9396c84b4219be8387d28d6dd64d1f4c92b01055a8dae546ea" Mar 16 00:23:03 crc kubenswrapper[4816]: I0316 00:23:03.705104 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59v87l" Mar 16 00:23:03 crc kubenswrapper[4816]: I0316 00:23:03.727885 4816 scope.go:117] "RemoveContainer" containerID="185e1a33c845773d7893f16759f110b3a4a2b357c62cdafa5e5060cabc62a64e" Mar 16 00:23:04 crc kubenswrapper[4816]: I0316 00:23:04.712541 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-w6wv7" event={"ID":"8d0f60fa-8d26-43ea-a680-1d3a92dd270d","Type":"ContainerStarted","Data":"ebc2db9eb32f16fc38e87be7218d1a538aa38bbb30fda48350609a1429f10a8e"} Mar 16 00:23:04 crc kubenswrapper[4816]: I0316 00:23:04.713973 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elasticsearch-es-default-0" event={"ID":"819af9fc-6db9-4743-bd06-f844f5ef5b0d","Type":"ContainerStarted","Data":"b584e1a8e4b0ac65a25b32d47ec6ced936ac543f56fdddb244f8dbc549daaeee"} Mar 16 00:23:04 crc kubenswrapper[4816]: I0316 00:23:04.714942 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-59bdc8b94-w6wv7" Mar 16 00:23:04 crc kubenswrapper[4816]: I0316 00:23:04.726546 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-59bdc8b94-w6wv7" Mar 16 00:23:04 crc kubenswrapper[4816]: I0316 00:23:04.783442 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-59bdc8b94-w6wv7" podStartSLOduration=1.779362475 podStartE2EDuration="39.783426578s" podCreationTimestamp="2026-03-16 00:22:25 +0000 UTC" firstStartedPulling="2026-03-16 00:22:25.967703389 +0000 UTC m=+939.064003342" lastFinishedPulling="2026-03-16 00:23:03.971767502 +0000 UTC m=+977.068067445" observedRunningTime="2026-03-16 00:23:04.780195375 +0000 UTC m=+977.876495328" watchObservedRunningTime="2026-03-16 00:23:04.783426578 +0000 UTC m=+977.879726531" Mar 
16 00:23:04 crc kubenswrapper[4816]: I0316 00:23:04.878928 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elasticsearch-es-default-0"] Mar 16 00:23:04 crc kubenswrapper[4816]: I0316 00:23:04.905254 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elasticsearch-es-default-0"] Mar 16 00:23:06 crc kubenswrapper[4816]: I0316 00:23:06.723807 4816 generic.go:334] "Generic (PLEG): container finished" podID="819af9fc-6db9-4743-bd06-f844f5ef5b0d" containerID="b584e1a8e4b0ac65a25b32d47ec6ced936ac543f56fdddb244f8dbc549daaeee" exitCode=0 Mar 16 00:23:06 crc kubenswrapper[4816]: I0316 00:23:06.725214 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elasticsearch-es-default-0" event={"ID":"819af9fc-6db9-4743-bd06-f844f5ef5b0d","Type":"ContainerDied","Data":"b584e1a8e4b0ac65a25b32d47ec6ced936ac543f56fdddb244f8dbc549daaeee"} Mar 16 00:23:12 crc kubenswrapper[4816]: I0316 00:23:12.756769 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elasticsearch-es-default-0" event={"ID":"819af9fc-6db9-4743-bd06-f844f5ef5b0d","Type":"ContainerStarted","Data":"6c31b45225359d15c615de3dc1429eddcd946a9788d4ec8d328f458ff6087e54"} Mar 16 00:23:13 crc kubenswrapper[4816]: I0316 00:23:13.399221 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-74xcs"] Mar 16 00:23:13 crc kubenswrapper[4816]: E0316 00:23:13.399448 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1da45fda-a8cc-46c1-8831-58418ecc9819" containerName="util" Mar 16 00:23:13 crc kubenswrapper[4816]: I0316 00:23:13.399459 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="1da45fda-a8cc-46c1-8831-58418ecc9819" containerName="util" Mar 16 00:23:13 crc kubenswrapper[4816]: E0316 00:23:13.399472 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1da45fda-a8cc-46c1-8831-58418ecc9819" 
containerName="extract" Mar 16 00:23:13 crc kubenswrapper[4816]: I0316 00:23:13.399478 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="1da45fda-a8cc-46c1-8831-58418ecc9819" containerName="extract" Mar 16 00:23:13 crc kubenswrapper[4816]: E0316 00:23:13.399484 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1da45fda-a8cc-46c1-8831-58418ecc9819" containerName="pull" Mar 16 00:23:13 crc kubenswrapper[4816]: I0316 00:23:13.399490 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="1da45fda-a8cc-46c1-8831-58418ecc9819" containerName="pull" Mar 16 00:23:13 crc kubenswrapper[4816]: I0316 00:23:13.399590 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="1da45fda-a8cc-46c1-8831-58418ecc9819" containerName="extract" Mar 16 00:23:13 crc kubenswrapper[4816]: I0316 00:23:13.400013 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-74xcs" Mar 16 00:23:13 crc kubenswrapper[4816]: I0316 00:23:13.403647 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"kube-root-ca.crt" Mar 16 00:23:13 crc kubenswrapper[4816]: I0316 00:23:13.404175 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"openshift-service-ca.crt" Mar 16 00:23:13 crc kubenswrapper[4816]: I0316 00:23:13.404326 4816 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager-operator"/"cert-manager-operator-controller-manager-dockercfg-qnvfr" Mar 16 00:23:13 crc kubenswrapper[4816]: I0316 00:23:13.437965 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-74xcs"] Mar 16 00:23:13 crc kubenswrapper[4816]: I0316 00:23:13.527780 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rmxlz\" (UniqueName: 
\"kubernetes.io/projected/eb3fdaff-975a-4df2-a9f2-67b63b708615-kube-api-access-rmxlz\") pod \"cert-manager-operator-controller-manager-5586865c96-74xcs\" (UID: \"eb3fdaff-975a-4df2-a9f2-67b63b708615\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-74xcs" Mar 16 00:23:13 crc kubenswrapper[4816]: I0316 00:23:13.527854 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/eb3fdaff-975a-4df2-a9f2-67b63b708615-tmp\") pod \"cert-manager-operator-controller-manager-5586865c96-74xcs\" (UID: \"eb3fdaff-975a-4df2-a9f2-67b63b708615\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-74xcs" Mar 16 00:23:13 crc kubenswrapper[4816]: I0316 00:23:13.629031 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rmxlz\" (UniqueName: \"kubernetes.io/projected/eb3fdaff-975a-4df2-a9f2-67b63b708615-kube-api-access-rmxlz\") pod \"cert-manager-operator-controller-manager-5586865c96-74xcs\" (UID: \"eb3fdaff-975a-4df2-a9f2-67b63b708615\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-74xcs" Mar 16 00:23:13 crc kubenswrapper[4816]: I0316 00:23:13.629096 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/eb3fdaff-975a-4df2-a9f2-67b63b708615-tmp\") pod \"cert-manager-operator-controller-manager-5586865c96-74xcs\" (UID: \"eb3fdaff-975a-4df2-a9f2-67b63b708615\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-74xcs" Mar 16 00:23:13 crc kubenswrapper[4816]: I0316 00:23:13.629761 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/eb3fdaff-975a-4df2-a9f2-67b63b708615-tmp\") pod \"cert-manager-operator-controller-manager-5586865c96-74xcs\" (UID: \"eb3fdaff-975a-4df2-a9f2-67b63b708615\") " 
pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-74xcs" Mar 16 00:23:13 crc kubenswrapper[4816]: I0316 00:23:13.649739 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rmxlz\" (UniqueName: \"kubernetes.io/projected/eb3fdaff-975a-4df2-a9f2-67b63b708615-kube-api-access-rmxlz\") pod \"cert-manager-operator-controller-manager-5586865c96-74xcs\" (UID: \"eb3fdaff-975a-4df2-a9f2-67b63b708615\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-74xcs" Mar 16 00:23:13 crc kubenswrapper[4816]: I0316 00:23:13.764229 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-74xcs" Mar 16 00:23:13 crc kubenswrapper[4816]: I0316 00:23:13.768933 4816 generic.go:334] "Generic (PLEG): container finished" podID="819af9fc-6db9-4743-bd06-f844f5ef5b0d" containerID="6c31b45225359d15c615de3dc1429eddcd946a9788d4ec8d328f458ff6087e54" exitCode=0 Mar 16 00:23:13 crc kubenswrapper[4816]: I0316 00:23:13.768990 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elasticsearch-es-default-0" event={"ID":"819af9fc-6db9-4743-bd06-f844f5ef5b0d","Type":"ContainerDied","Data":"6c31b45225359d15c615de3dc1429eddcd946a9788d4ec8d328f458ff6087e54"} Mar 16 00:23:14 crc kubenswrapper[4816]: I0316 00:23:14.298488 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-74xcs"] Mar 16 00:23:14 crc kubenswrapper[4816]: I0316 00:23:14.775590 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-74xcs" event={"ID":"eb3fdaff-975a-4df2-a9f2-67b63b708615","Type":"ContainerStarted","Data":"f27a01e50df74c0eaacba0e1f44ee68a7de94b8d48de71b66a41e23589e4f2a6"} Mar 16 00:23:14 crc kubenswrapper[4816]: I0316 00:23:14.777758 4816 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="service-telemetry/elasticsearch-es-default-0" event={"ID":"819af9fc-6db9-4743-bd06-f844f5ef5b0d","Type":"ContainerStarted","Data":"bf1222611fb6e91e46e59464f518afd20a81523a36e5eac0ce8cf4090ae19ce7"} Mar 16 00:23:14 crc kubenswrapper[4816]: I0316 00:23:14.777929 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="service-telemetry/elasticsearch-es-default-0" Mar 16 00:23:14 crc kubenswrapper[4816]: I0316 00:23:14.809315 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/elasticsearch-es-default-0" podStartSLOduration=15.807364772 podStartE2EDuration="27.809294535s" podCreationTimestamp="2026-03-16 00:22:47 +0000 UTC" firstStartedPulling="2026-03-16 00:22:52.100172581 +0000 UTC m=+965.196472534" lastFinishedPulling="2026-03-16 00:23:04.102102334 +0000 UTC m=+977.198402297" observedRunningTime="2026-03-16 00:23:14.80462967 +0000 UTC m=+987.900929633" watchObservedRunningTime="2026-03-16 00:23:14.809294535 +0000 UTC m=+987.905594488" Mar 16 00:23:17 crc kubenswrapper[4816]: I0316 00:23:17.795075 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-74xcs" event={"ID":"eb3fdaff-975a-4df2-a9f2-67b63b708615","Type":"ContainerStarted","Data":"87d86e56cfeee83f609cc9971120bf75c47e31110d4c0a159b405590a73e8b2f"} Mar 16 00:23:17 crc kubenswrapper[4816]: I0316 00:23:17.822696 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-74xcs" podStartSLOduration=1.679253431 podStartE2EDuration="4.822681462s" podCreationTimestamp="2026-03-16 00:23:13 +0000 UTC" firstStartedPulling="2026-03-16 00:23:14.301862057 +0000 UTC m=+987.398162010" lastFinishedPulling="2026-03-16 00:23:17.445290078 +0000 UTC m=+990.541590041" observedRunningTime="2026-03-16 00:23:17.817655357 +0000 UTC m=+990.913955310" 
watchObservedRunningTime="2026-03-16 00:23:17.822681462 +0000 UTC m=+990.918981415" Mar 16 00:23:20 crc kubenswrapper[4816]: I0316 00:23:20.945682 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-ssr4q"] Mar 16 00:23:20 crc kubenswrapper[4816]: I0316 00:23:20.947066 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-6888856db4-ssr4q" Mar 16 00:23:20 crc kubenswrapper[4816]: I0316 00:23:20.949156 4816 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-m24cv" Mar 16 00:23:20 crc kubenswrapper[4816]: I0316 00:23:20.959934 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Mar 16 00:23:20 crc kubenswrapper[4816]: I0316 00:23:20.960681 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Mar 16 00:23:20 crc kubenswrapper[4816]: I0316 00:23:20.963982 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-ssr4q"] Mar 16 00:23:21 crc kubenswrapper[4816]: I0316 00:23:21.028069 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ca67da37-05ff-4b13-aeea-04ac7f17ffc0-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-ssr4q\" (UID: \"ca67da37-05ff-4b13-aeea-04ac7f17ffc0\") " pod="cert-manager/cert-manager-webhook-6888856db4-ssr4q" Mar 16 00:23:21 crc kubenswrapper[4816]: I0316 00:23:21.028179 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wnn6l\" (UniqueName: \"kubernetes.io/projected/ca67da37-05ff-4b13-aeea-04ac7f17ffc0-kube-api-access-wnn6l\") pod \"cert-manager-webhook-6888856db4-ssr4q\" (UID: \"ca67da37-05ff-4b13-aeea-04ac7f17ffc0\") " 
pod="cert-manager/cert-manager-webhook-6888856db4-ssr4q" Mar 16 00:23:21 crc kubenswrapper[4816]: I0316 00:23:21.129938 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wnn6l\" (UniqueName: \"kubernetes.io/projected/ca67da37-05ff-4b13-aeea-04ac7f17ffc0-kube-api-access-wnn6l\") pod \"cert-manager-webhook-6888856db4-ssr4q\" (UID: \"ca67da37-05ff-4b13-aeea-04ac7f17ffc0\") " pod="cert-manager/cert-manager-webhook-6888856db4-ssr4q" Mar 16 00:23:21 crc kubenswrapper[4816]: I0316 00:23:21.130286 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ca67da37-05ff-4b13-aeea-04ac7f17ffc0-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-ssr4q\" (UID: \"ca67da37-05ff-4b13-aeea-04ac7f17ffc0\") " pod="cert-manager/cert-manager-webhook-6888856db4-ssr4q" Mar 16 00:23:21 crc kubenswrapper[4816]: I0316 00:23:21.151495 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wnn6l\" (UniqueName: \"kubernetes.io/projected/ca67da37-05ff-4b13-aeea-04ac7f17ffc0-kube-api-access-wnn6l\") pod \"cert-manager-webhook-6888856db4-ssr4q\" (UID: \"ca67da37-05ff-4b13-aeea-04ac7f17ffc0\") " pod="cert-manager/cert-manager-webhook-6888856db4-ssr4q" Mar 16 00:23:21 crc kubenswrapper[4816]: I0316 00:23:21.154359 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ca67da37-05ff-4b13-aeea-04ac7f17ffc0-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-ssr4q\" (UID: \"ca67da37-05ff-4b13-aeea-04ac7f17ffc0\") " pod="cert-manager/cert-manager-webhook-6888856db4-ssr4q" Mar 16 00:23:21 crc kubenswrapper[4816]: I0316 00:23:21.264163 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-6888856db4-ssr4q" Mar 16 00:23:21 crc kubenswrapper[4816]: I0316 00:23:21.796690 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-ssr4q"] Mar 16 00:23:21 crc kubenswrapper[4816]: I0316 00:23:21.819893 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-6888856db4-ssr4q" event={"ID":"ca67da37-05ff-4b13-aeea-04ac7f17ffc0","Type":"ContainerStarted","Data":"41b69df87f03003db592c1f15b5da63cc8346c7d7995b899a24b7152a94405f1"} Mar 16 00:23:23 crc kubenswrapper[4816]: I0316 00:23:23.363859 4816 prober.go:107] "Probe failed" probeType="Readiness" pod="service-telemetry/elasticsearch-es-default-0" podUID="819af9fc-6db9-4743-bd06-f844f5ef5b0d" containerName="elasticsearch" probeResult="failure" output=< Mar 16 00:23:23 crc kubenswrapper[4816]: {"timestamp": "2026-03-16T00:23:23+00:00", "message": "readiness probe failed", "curl_rc": "7"} Mar 16 00:23:23 crc kubenswrapper[4816]: > Mar 16 00:23:24 crc kubenswrapper[4816]: I0316 00:23:24.747256 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-25jvg"] Mar 16 00:23:24 crc kubenswrapper[4816]: I0316 00:23:24.748861 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-5545bd876-25jvg" Mar 16 00:23:24 crc kubenswrapper[4816]: I0316 00:23:24.751866 4816 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-2z8m6" Mar 16 00:23:24 crc kubenswrapper[4816]: I0316 00:23:24.776737 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-25jvg"] Mar 16 00:23:24 crc kubenswrapper[4816]: I0316 00:23:24.879952 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fe81d263-aafd-4bdb-a088-d4bc52592a2d-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-25jvg\" (UID: \"fe81d263-aafd-4bdb-a088-d4bc52592a2d\") " pod="cert-manager/cert-manager-cainjector-5545bd876-25jvg" Mar 16 00:23:24 crc kubenswrapper[4816]: I0316 00:23:24.879999 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6gtkg\" (UniqueName: \"kubernetes.io/projected/fe81d263-aafd-4bdb-a088-d4bc52592a2d-kube-api-access-6gtkg\") pod \"cert-manager-cainjector-5545bd876-25jvg\" (UID: \"fe81d263-aafd-4bdb-a088-d4bc52592a2d\") " pod="cert-manager/cert-manager-cainjector-5545bd876-25jvg" Mar 16 00:23:24 crc kubenswrapper[4816]: I0316 00:23:24.981281 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fe81d263-aafd-4bdb-a088-d4bc52592a2d-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-25jvg\" (UID: \"fe81d263-aafd-4bdb-a088-d4bc52592a2d\") " pod="cert-manager/cert-manager-cainjector-5545bd876-25jvg" Mar 16 00:23:24 crc kubenswrapper[4816]: I0316 00:23:24.981700 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6gtkg\" (UniqueName: 
\"kubernetes.io/projected/fe81d263-aafd-4bdb-a088-d4bc52592a2d-kube-api-access-6gtkg\") pod \"cert-manager-cainjector-5545bd876-25jvg\" (UID: \"fe81d263-aafd-4bdb-a088-d4bc52592a2d\") " pod="cert-manager/cert-manager-cainjector-5545bd876-25jvg" Mar 16 00:23:25 crc kubenswrapper[4816]: I0316 00:23:25.007966 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fe81d263-aafd-4bdb-a088-d4bc52592a2d-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-25jvg\" (UID: \"fe81d263-aafd-4bdb-a088-d4bc52592a2d\") " pod="cert-manager/cert-manager-cainjector-5545bd876-25jvg" Mar 16 00:23:25 crc kubenswrapper[4816]: I0316 00:23:25.008086 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6gtkg\" (UniqueName: \"kubernetes.io/projected/fe81d263-aafd-4bdb-a088-d4bc52592a2d-kube-api-access-6gtkg\") pod \"cert-manager-cainjector-5545bd876-25jvg\" (UID: \"fe81d263-aafd-4bdb-a088-d4bc52592a2d\") " pod="cert-manager/cert-manager-cainjector-5545bd876-25jvg" Mar 16 00:23:25 crc kubenswrapper[4816]: I0316 00:23:25.074851 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-5545bd876-25jvg" Mar 16 00:23:25 crc kubenswrapper[4816]: I0316 00:23:25.285532 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/service-telemetry-operator-1-build"] Mar 16 00:23:25 crc kubenswrapper[4816]: I0316 00:23:25.289958 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-1-build" Mar 16 00:23:25 crc kubenswrapper[4816]: I0316 00:23:25.294869 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-1-build"] Mar 16 00:23:25 crc kubenswrapper[4816]: I0316 00:23:25.295596 4816 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-fs5z5" Mar 16 00:23:25 crc kubenswrapper[4816]: I0316 00:23:25.295695 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-1-global-ca" Mar 16 00:23:25 crc kubenswrapper[4816]: I0316 00:23:25.295770 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-1-ca" Mar 16 00:23:25 crc kubenswrapper[4816]: I0316 00:23:25.295846 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-1-sys-config" Mar 16 00:23:25 crc kubenswrapper[4816]: I0316 00:23:25.388617 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/1fe11315-8a31-4f80-b084-fdb8542e0074-build-blob-cache\") pod \"service-telemetry-operator-1-build\" (UID: \"1fe11315-8a31-4f80-b084-fdb8542e0074\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 16 00:23:25 crc kubenswrapper[4816]: I0316 00:23:25.388660 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/1fe11315-8a31-4f80-b084-fdb8542e0074-node-pullsecrets\") pod \"service-telemetry-operator-1-build\" (UID: \"1fe11315-8a31-4f80-b084-fdb8542e0074\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 16 00:23:25 crc kubenswrapper[4816]: I0316 00:23:25.388881 4816 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/1fe11315-8a31-4f80-b084-fdb8542e0074-buildworkdir\") pod \"service-telemetry-operator-1-build\" (UID: \"1fe11315-8a31-4f80-b084-fdb8542e0074\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 16 00:23:25 crc kubenswrapper[4816]: I0316 00:23:25.388959 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/1fe11315-8a31-4f80-b084-fdb8542e0074-container-storage-root\") pod \"service-telemetry-operator-1-build\" (UID: \"1fe11315-8a31-4f80-b084-fdb8542e0074\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 16 00:23:25 crc kubenswrapper[4816]: I0316 00:23:25.389043 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/1fe11315-8a31-4f80-b084-fdb8542e0074-buildcachedir\") pod \"service-telemetry-operator-1-build\" (UID: \"1fe11315-8a31-4f80-b084-fdb8542e0074\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 16 00:23:25 crc kubenswrapper[4816]: I0316 00:23:25.389085 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/1fe11315-8a31-4f80-b084-fdb8542e0074-container-storage-run\") pod \"service-telemetry-operator-1-build\" (UID: \"1fe11315-8a31-4f80-b084-fdb8542e0074\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 16 00:23:25 crc kubenswrapper[4816]: I0316 00:23:25.389126 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/1fe11315-8a31-4f80-b084-fdb8542e0074-build-system-configs\") pod \"service-telemetry-operator-1-build\" (UID: 
\"1fe11315-8a31-4f80-b084-fdb8542e0074\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 16 00:23:25 crc kubenswrapper[4816]: I0316 00:23:25.389158 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wphcb\" (UniqueName: \"kubernetes.io/projected/1fe11315-8a31-4f80-b084-fdb8542e0074-kube-api-access-wphcb\") pod \"service-telemetry-operator-1-build\" (UID: \"1fe11315-8a31-4f80-b084-fdb8542e0074\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 16 00:23:25 crc kubenswrapper[4816]: I0316 00:23:25.389253 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-fs5z5-push\" (UniqueName: \"kubernetes.io/secret/1fe11315-8a31-4f80-b084-fdb8542e0074-builder-dockercfg-fs5z5-push\") pod \"service-telemetry-operator-1-build\" (UID: \"1fe11315-8a31-4f80-b084-fdb8542e0074\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 16 00:23:25 crc kubenswrapper[4816]: I0316 00:23:25.389288 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-fs5z5-pull\" (UniqueName: \"kubernetes.io/secret/1fe11315-8a31-4f80-b084-fdb8542e0074-builder-dockercfg-fs5z5-pull\") pod \"service-telemetry-operator-1-build\" (UID: \"1fe11315-8a31-4f80-b084-fdb8542e0074\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 16 00:23:25 crc kubenswrapper[4816]: I0316 00:23:25.389376 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1fe11315-8a31-4f80-b084-fdb8542e0074-build-ca-bundles\") pod \"service-telemetry-operator-1-build\" (UID: \"1fe11315-8a31-4f80-b084-fdb8542e0074\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 16 00:23:25 crc kubenswrapper[4816]: I0316 00:23:25.389458 4816 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1fe11315-8a31-4f80-b084-fdb8542e0074-build-proxy-ca-bundles\") pod \"service-telemetry-operator-1-build\" (UID: \"1fe11315-8a31-4f80-b084-fdb8542e0074\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 16 00:23:25 crc kubenswrapper[4816]: I0316 00:23:25.491510 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1fe11315-8a31-4f80-b084-fdb8542e0074-build-proxy-ca-bundles\") pod \"service-telemetry-operator-1-build\" (UID: \"1fe11315-8a31-4f80-b084-fdb8542e0074\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 16 00:23:25 crc kubenswrapper[4816]: I0316 00:23:25.491600 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/1fe11315-8a31-4f80-b084-fdb8542e0074-build-blob-cache\") pod \"service-telemetry-operator-1-build\" (UID: \"1fe11315-8a31-4f80-b084-fdb8542e0074\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 16 00:23:25 crc kubenswrapper[4816]: I0316 00:23:25.491629 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/1fe11315-8a31-4f80-b084-fdb8542e0074-node-pullsecrets\") pod \"service-telemetry-operator-1-build\" (UID: \"1fe11315-8a31-4f80-b084-fdb8542e0074\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 16 00:23:25 crc kubenswrapper[4816]: I0316 00:23:25.491666 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/1fe11315-8a31-4f80-b084-fdb8542e0074-buildworkdir\") pod \"service-telemetry-operator-1-build\" (UID: \"1fe11315-8a31-4f80-b084-fdb8542e0074\") " 
pod="service-telemetry/service-telemetry-operator-1-build" Mar 16 00:23:25 crc kubenswrapper[4816]: I0316 00:23:25.491690 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/1fe11315-8a31-4f80-b084-fdb8542e0074-container-storage-root\") pod \"service-telemetry-operator-1-build\" (UID: \"1fe11315-8a31-4f80-b084-fdb8542e0074\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 16 00:23:25 crc kubenswrapper[4816]: I0316 00:23:25.491734 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/1fe11315-8a31-4f80-b084-fdb8542e0074-buildcachedir\") pod \"service-telemetry-operator-1-build\" (UID: \"1fe11315-8a31-4f80-b084-fdb8542e0074\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 16 00:23:25 crc kubenswrapper[4816]: I0316 00:23:25.491763 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/1fe11315-8a31-4f80-b084-fdb8542e0074-container-storage-run\") pod \"service-telemetry-operator-1-build\" (UID: \"1fe11315-8a31-4f80-b084-fdb8542e0074\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 16 00:23:25 crc kubenswrapper[4816]: I0316 00:23:25.491799 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/1fe11315-8a31-4f80-b084-fdb8542e0074-build-system-configs\") pod \"service-telemetry-operator-1-build\" (UID: \"1fe11315-8a31-4f80-b084-fdb8542e0074\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 16 00:23:25 crc kubenswrapper[4816]: I0316 00:23:25.491823 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wphcb\" (UniqueName: 
\"kubernetes.io/projected/1fe11315-8a31-4f80-b084-fdb8542e0074-kube-api-access-wphcb\") pod \"service-telemetry-operator-1-build\" (UID: \"1fe11315-8a31-4f80-b084-fdb8542e0074\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 16 00:23:25 crc kubenswrapper[4816]: I0316 00:23:25.491860 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-fs5z5-push\" (UniqueName: \"kubernetes.io/secret/1fe11315-8a31-4f80-b084-fdb8542e0074-builder-dockercfg-fs5z5-push\") pod \"service-telemetry-operator-1-build\" (UID: \"1fe11315-8a31-4f80-b084-fdb8542e0074\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 16 00:23:25 crc kubenswrapper[4816]: I0316 00:23:25.491900 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-fs5z5-pull\" (UniqueName: \"kubernetes.io/secret/1fe11315-8a31-4f80-b084-fdb8542e0074-builder-dockercfg-fs5z5-pull\") pod \"service-telemetry-operator-1-build\" (UID: \"1fe11315-8a31-4f80-b084-fdb8542e0074\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 16 00:23:25 crc kubenswrapper[4816]: I0316 00:23:25.491931 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1fe11315-8a31-4f80-b084-fdb8542e0074-build-ca-bundles\") pod \"service-telemetry-operator-1-build\" (UID: \"1fe11315-8a31-4f80-b084-fdb8542e0074\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 16 00:23:25 crc kubenswrapper[4816]: I0316 00:23:25.492113 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/1fe11315-8a31-4f80-b084-fdb8542e0074-buildworkdir\") pod \"service-telemetry-operator-1-build\" (UID: \"1fe11315-8a31-4f80-b084-fdb8542e0074\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 16 00:23:25 crc kubenswrapper[4816]: I0316 00:23:25.492283 4816 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1fe11315-8a31-4f80-b084-fdb8542e0074-build-proxy-ca-bundles\") pod \"service-telemetry-operator-1-build\" (UID: \"1fe11315-8a31-4f80-b084-fdb8542e0074\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 16 00:23:25 crc kubenswrapper[4816]: I0316 00:23:25.492346 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/1fe11315-8a31-4f80-b084-fdb8542e0074-buildcachedir\") pod \"service-telemetry-operator-1-build\" (UID: \"1fe11315-8a31-4f80-b084-fdb8542e0074\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 16 00:23:25 crc kubenswrapper[4816]: I0316 00:23:25.492532 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/1fe11315-8a31-4f80-b084-fdb8542e0074-container-storage-root\") pod \"service-telemetry-operator-1-build\" (UID: \"1fe11315-8a31-4f80-b084-fdb8542e0074\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 16 00:23:25 crc kubenswrapper[4816]: I0316 00:23:25.492789 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/1fe11315-8a31-4f80-b084-fdb8542e0074-build-system-configs\") pod \"service-telemetry-operator-1-build\" (UID: \"1fe11315-8a31-4f80-b084-fdb8542e0074\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 16 00:23:25 crc kubenswrapper[4816]: I0316 00:23:25.492821 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1fe11315-8a31-4f80-b084-fdb8542e0074-build-ca-bundles\") pod \"service-telemetry-operator-1-build\" (UID: \"1fe11315-8a31-4f80-b084-fdb8542e0074\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 16 00:23:25 
crc kubenswrapper[4816]: I0316 00:23:25.492952 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/1fe11315-8a31-4f80-b084-fdb8542e0074-build-blob-cache\") pod \"service-telemetry-operator-1-build\" (UID: \"1fe11315-8a31-4f80-b084-fdb8542e0074\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 16 00:23:25 crc kubenswrapper[4816]: I0316 00:23:25.493037 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/1fe11315-8a31-4f80-b084-fdb8542e0074-node-pullsecrets\") pod \"service-telemetry-operator-1-build\" (UID: \"1fe11315-8a31-4f80-b084-fdb8542e0074\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 16 00:23:25 crc kubenswrapper[4816]: I0316 00:23:25.493166 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/1fe11315-8a31-4f80-b084-fdb8542e0074-container-storage-run\") pod \"service-telemetry-operator-1-build\" (UID: \"1fe11315-8a31-4f80-b084-fdb8542e0074\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 16 00:23:25 crc kubenswrapper[4816]: I0316 00:23:25.498331 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-fs5z5-push\" (UniqueName: \"kubernetes.io/secret/1fe11315-8a31-4f80-b084-fdb8542e0074-builder-dockercfg-fs5z5-push\") pod \"service-telemetry-operator-1-build\" (UID: \"1fe11315-8a31-4f80-b084-fdb8542e0074\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 16 00:23:25 crc kubenswrapper[4816]: I0316 00:23:25.502340 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-fs5z5-pull\" (UniqueName: \"kubernetes.io/secret/1fe11315-8a31-4f80-b084-fdb8542e0074-builder-dockercfg-fs5z5-pull\") pod \"service-telemetry-operator-1-build\" (UID: \"1fe11315-8a31-4f80-b084-fdb8542e0074\") 
" pod="service-telemetry/service-telemetry-operator-1-build" Mar 16 00:23:25 crc kubenswrapper[4816]: I0316 00:23:25.510589 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wphcb\" (UniqueName: \"kubernetes.io/projected/1fe11315-8a31-4f80-b084-fdb8542e0074-kube-api-access-wphcb\") pod \"service-telemetry-operator-1-build\" (UID: \"1fe11315-8a31-4f80-b084-fdb8542e0074\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 16 00:23:25 crc kubenswrapper[4816]: I0316 00:23:25.607133 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-1-build" Mar 16 00:23:26 crc kubenswrapper[4816]: I0316 00:23:26.868349 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-6888856db4-ssr4q" event={"ID":"ca67da37-05ff-4b13-aeea-04ac7f17ffc0","Type":"ContainerStarted","Data":"c208eaca8aeab9d2179d24b48a1ba299a7908a41d5c5a9debd4fcf20cd66187c"} Mar 16 00:23:26 crc kubenswrapper[4816]: I0316 00:23:26.868898 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-6888856db4-ssr4q" Mar 16 00:23:26 crc kubenswrapper[4816]: I0316 00:23:26.895339 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-6888856db4-ssr4q" podStartSLOduration=2.038849025 podStartE2EDuration="6.894646139s" podCreationTimestamp="2026-03-16 00:23:20 +0000 UTC" firstStartedPulling="2026-03-16 00:23:21.813393814 +0000 UTC m=+994.909693767" lastFinishedPulling="2026-03-16 00:23:26.669190928 +0000 UTC m=+999.765490881" observedRunningTime="2026-03-16 00:23:26.889726217 +0000 UTC m=+999.986026160" watchObservedRunningTime="2026-03-16 00:23:26.894646139 +0000 UTC m=+999.990946092" Mar 16 00:23:27 crc kubenswrapper[4816]: I0316 00:23:27.149097 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["cert-manager/cert-manager-cainjector-5545bd876-25jvg"] Mar 16 00:23:27 crc kubenswrapper[4816]: W0316 00:23:27.154458 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfe81d263_aafd_4bdb_a088_d4bc52592a2d.slice/crio-cea0786ba796e78188331f295762f145ba8b98d9895fd4c229d90ebecb590447 WatchSource:0}: Error finding container cea0786ba796e78188331f295762f145ba8b98d9895fd4c229d90ebecb590447: Status 404 returned error can't find the container with id cea0786ba796e78188331f295762f145ba8b98d9895fd4c229d90ebecb590447 Mar 16 00:23:27 crc kubenswrapper[4816]: I0316 00:23:27.155204 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-1-build"] Mar 16 00:23:27 crc kubenswrapper[4816]: I0316 00:23:27.875069 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-5545bd876-25jvg" event={"ID":"fe81d263-aafd-4bdb-a088-d4bc52592a2d","Type":"ContainerStarted","Data":"2dc9dc42025bc940a969c3b552ac8216b58ee0162daebc16d46303d8726b91bd"} Mar 16 00:23:27 crc kubenswrapper[4816]: I0316 00:23:27.875433 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-5545bd876-25jvg" event={"ID":"fe81d263-aafd-4bdb-a088-d4bc52592a2d","Type":"ContainerStarted","Data":"cea0786ba796e78188331f295762f145ba8b98d9895fd4c229d90ebecb590447"} Mar 16 00:23:27 crc kubenswrapper[4816]: I0316 00:23:27.876687 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-1-build" event={"ID":"1fe11315-8a31-4f80-b084-fdb8542e0074","Type":"ContainerStarted","Data":"be6f67c8612bf615c5637b63abc0ae83d660784155c8869bc8e23fccdb9f8c21"} Mar 16 00:23:27 crc kubenswrapper[4816]: I0316 00:23:27.893603 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-5545bd876-25jvg" podStartSLOduration=3.893579635 
podStartE2EDuration="3.893579635s" podCreationTimestamp="2026-03-16 00:23:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:23:27.893558375 +0000 UTC m=+1000.989858328" watchObservedRunningTime="2026-03-16 00:23:27.893579635 +0000 UTC m=+1000.989879598" Mar 16 00:23:28 crc kubenswrapper[4816]: I0316 00:23:28.963260 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="service-telemetry/elasticsearch-es-default-0" Mar 16 00:23:31 crc kubenswrapper[4816]: I0316 00:23:31.269499 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-6888856db4-ssr4q" Mar 16 00:23:31 crc kubenswrapper[4816]: I0316 00:23:31.809912 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-545d4d4674-9q9nz"] Mar 16 00:23:31 crc kubenswrapper[4816]: I0316 00:23:31.810758 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-545d4d4674-9q9nz" Mar 16 00:23:31 crc kubenswrapper[4816]: I0316 00:23:31.819396 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-545d4d4674-9q9nz"] Mar 16 00:23:31 crc kubenswrapper[4816]: I0316 00:23:31.823358 4816 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-22czx" Mar 16 00:23:31 crc kubenswrapper[4816]: I0316 00:23:31.885230 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tkpxq\" (UniqueName: \"kubernetes.io/projected/88d51e1b-a795-4157-82b4-8a74d228e698-kube-api-access-tkpxq\") pod \"cert-manager-545d4d4674-9q9nz\" (UID: \"88d51e1b-a795-4157-82b4-8a74d228e698\") " pod="cert-manager/cert-manager-545d4d4674-9q9nz" Mar 16 00:23:31 crc kubenswrapper[4816]: I0316 00:23:31.885529 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/88d51e1b-a795-4157-82b4-8a74d228e698-bound-sa-token\") pod \"cert-manager-545d4d4674-9q9nz\" (UID: \"88d51e1b-a795-4157-82b4-8a74d228e698\") " pod="cert-manager/cert-manager-545d4d4674-9q9nz" Mar 16 00:23:31 crc kubenswrapper[4816]: I0316 00:23:31.987103 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tkpxq\" (UniqueName: \"kubernetes.io/projected/88d51e1b-a795-4157-82b4-8a74d228e698-kube-api-access-tkpxq\") pod \"cert-manager-545d4d4674-9q9nz\" (UID: \"88d51e1b-a795-4157-82b4-8a74d228e698\") " pod="cert-manager/cert-manager-545d4d4674-9q9nz" Mar 16 00:23:31 crc kubenswrapper[4816]: I0316 00:23:31.987171 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/88d51e1b-a795-4157-82b4-8a74d228e698-bound-sa-token\") pod \"cert-manager-545d4d4674-9q9nz\" (UID: 
\"88d51e1b-a795-4157-82b4-8a74d228e698\") " pod="cert-manager/cert-manager-545d4d4674-9q9nz" Mar 16 00:23:32 crc kubenswrapper[4816]: I0316 00:23:32.008392 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/88d51e1b-a795-4157-82b4-8a74d228e698-bound-sa-token\") pod \"cert-manager-545d4d4674-9q9nz\" (UID: \"88d51e1b-a795-4157-82b4-8a74d228e698\") " pod="cert-manager/cert-manager-545d4d4674-9q9nz" Mar 16 00:23:32 crc kubenswrapper[4816]: I0316 00:23:32.008994 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tkpxq\" (UniqueName: \"kubernetes.io/projected/88d51e1b-a795-4157-82b4-8a74d228e698-kube-api-access-tkpxq\") pod \"cert-manager-545d4d4674-9q9nz\" (UID: \"88d51e1b-a795-4157-82b4-8a74d228e698\") " pod="cert-manager/cert-manager-545d4d4674-9q9nz" Mar 16 00:23:32 crc kubenswrapper[4816]: I0316 00:23:32.132495 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-545d4d4674-9q9nz" Mar 16 00:23:34 crc kubenswrapper[4816]: I0316 00:23:34.079925 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-545d4d4674-9q9nz"] Mar 16 00:23:34 crc kubenswrapper[4816]: I0316 00:23:34.930781 4816 generic.go:334] "Generic (PLEG): container finished" podID="1fe11315-8a31-4f80-b084-fdb8542e0074" containerID="260d19a4e09f99c50d12150236c719bb65b9e9ace774386f70e495d800792e5c" exitCode=0 Mar 16 00:23:34 crc kubenswrapper[4816]: I0316 00:23:34.930865 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-1-build" event={"ID":"1fe11315-8a31-4f80-b084-fdb8542e0074","Type":"ContainerDied","Data":"260d19a4e09f99c50d12150236c719bb65b9e9ace774386f70e495d800792e5c"} Mar 16 00:23:34 crc kubenswrapper[4816]: I0316 00:23:34.934892 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-545d4d4674-9q9nz" 
event={"ID":"88d51e1b-a795-4157-82b4-8a74d228e698","Type":"ContainerStarted","Data":"9d677ad5f20392a8d6ce9b563ba14b7ad91b7a163a96a096480c28c9940d205d"} Mar 16 00:23:34 crc kubenswrapper[4816]: I0316 00:23:34.935500 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-545d4d4674-9q9nz" event={"ID":"88d51e1b-a795-4157-82b4-8a74d228e698","Type":"ContainerStarted","Data":"2da63d4a0826b4121db74dca7d4420ff200b18389893711ba7758a5be940edb4"} Mar 16 00:23:34 crc kubenswrapper[4816]: I0316 00:23:34.994459 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-545d4d4674-9q9nz" podStartSLOduration=3.99443789 podStartE2EDuration="3.99443789s" podCreationTimestamp="2026-03-16 00:23:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:23:34.990509906 +0000 UTC m=+1008.086809859" watchObservedRunningTime="2026-03-16 00:23:34.99443789 +0000 UTC m=+1008.090737863" Mar 16 00:23:35 crc kubenswrapper[4816]: I0316 00:23:35.351640 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/service-telemetry-operator-1-build"] Mar 16 00:23:35 crc kubenswrapper[4816]: I0316 00:23:35.943846 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-1-build" event={"ID":"1fe11315-8a31-4f80-b084-fdb8542e0074","Type":"ContainerStarted","Data":"5ab87e5b52ff667c30c02fb846b96580f3a8bdb5023de19607d2656e42f7c8c7"} Mar 16 00:23:35 crc kubenswrapper[4816]: I0316 00:23:35.966034 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/service-telemetry-operator-1-build" podStartSLOduration=4.424430365 podStartE2EDuration="10.966017769s" podCreationTimestamp="2026-03-16 00:23:25 +0000 UTC" firstStartedPulling="2026-03-16 00:23:27.166835284 +0000 UTC m=+1000.263135237" lastFinishedPulling="2026-03-16 00:23:33.708422688 
+0000 UTC m=+1006.804722641" observedRunningTime="2026-03-16 00:23:35.964492225 +0000 UTC m=+1009.060792178" watchObservedRunningTime="2026-03-16 00:23:35.966017769 +0000 UTC m=+1009.062317722" Mar 16 00:23:36 crc kubenswrapper[4816]: I0316 00:23:36.956920 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/service-telemetry-operator-1-build" podUID="1fe11315-8a31-4f80-b084-fdb8542e0074" containerName="docker-build" containerID="cri-o://5ab87e5b52ff667c30c02fb846b96580f3a8bdb5023de19607d2656e42f7c8c7" gracePeriod=30 Mar 16 00:23:37 crc kubenswrapper[4816]: I0316 00:23:37.011446 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/service-telemetry-operator-2-build"] Mar 16 00:23:37 crc kubenswrapper[4816]: I0316 00:23:37.015097 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-2-build" Mar 16 00:23:37 crc kubenswrapper[4816]: I0316 00:23:37.018018 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-2-sys-config" Mar 16 00:23:37 crc kubenswrapper[4816]: I0316 00:23:37.020165 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-2-ca" Mar 16 00:23:37 crc kubenswrapper[4816]: I0316 00:23:37.020540 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-2-global-ca" Mar 16 00:23:37 crc kubenswrapper[4816]: I0316 00:23:37.038872 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-2-build"] Mar 16 00:23:37 crc kubenswrapper[4816]: I0316 00:23:37.158978 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h7zgw\" (UniqueName: \"kubernetes.io/projected/a4629507-876a-405c-891c-5dcd521cf590-kube-api-access-h7zgw\") pod 
\"service-telemetry-operator-2-build\" (UID: \"a4629507-876a-405c-891c-5dcd521cf590\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 16 00:23:37 crc kubenswrapper[4816]: I0316 00:23:37.159041 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-fs5z5-pull\" (UniqueName: \"kubernetes.io/secret/a4629507-876a-405c-891c-5dcd521cf590-builder-dockercfg-fs5z5-pull\") pod \"service-telemetry-operator-2-build\" (UID: \"a4629507-876a-405c-891c-5dcd521cf590\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 16 00:23:37 crc kubenswrapper[4816]: I0316 00:23:37.159070 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a4629507-876a-405c-891c-5dcd521cf590-build-proxy-ca-bundles\") pod \"service-telemetry-operator-2-build\" (UID: \"a4629507-876a-405c-891c-5dcd521cf590\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 16 00:23:37 crc kubenswrapper[4816]: I0316 00:23:37.159094 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/a4629507-876a-405c-891c-5dcd521cf590-node-pullsecrets\") pod \"service-telemetry-operator-2-build\" (UID: \"a4629507-876a-405c-891c-5dcd521cf590\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 16 00:23:37 crc kubenswrapper[4816]: I0316 00:23:37.159123 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/a4629507-876a-405c-891c-5dcd521cf590-container-storage-run\") pod \"service-telemetry-operator-2-build\" (UID: \"a4629507-876a-405c-891c-5dcd521cf590\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 16 00:23:37 crc kubenswrapper[4816]: I0316 00:23:37.159148 4816 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/a4629507-876a-405c-891c-5dcd521cf590-build-system-configs\") pod \"service-telemetry-operator-2-build\" (UID: \"a4629507-876a-405c-891c-5dcd521cf590\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 16 00:23:37 crc kubenswrapper[4816]: I0316 00:23:37.159173 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a4629507-876a-405c-891c-5dcd521cf590-build-ca-bundles\") pod \"service-telemetry-operator-2-build\" (UID: \"a4629507-876a-405c-891c-5dcd521cf590\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 16 00:23:37 crc kubenswrapper[4816]: I0316 00:23:37.159203 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-fs5z5-push\" (UniqueName: \"kubernetes.io/secret/a4629507-876a-405c-891c-5dcd521cf590-builder-dockercfg-fs5z5-push\") pod \"service-telemetry-operator-2-build\" (UID: \"a4629507-876a-405c-891c-5dcd521cf590\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 16 00:23:37 crc kubenswrapper[4816]: I0316 00:23:37.159267 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/a4629507-876a-405c-891c-5dcd521cf590-container-storage-root\") pod \"service-telemetry-operator-2-build\" (UID: \"a4629507-876a-405c-891c-5dcd521cf590\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 16 00:23:37 crc kubenswrapper[4816]: I0316 00:23:37.159348 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/a4629507-876a-405c-891c-5dcd521cf590-buildworkdir\") pod 
\"service-telemetry-operator-2-build\" (UID: \"a4629507-876a-405c-891c-5dcd521cf590\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 16 00:23:37 crc kubenswrapper[4816]: I0316 00:23:37.159386 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/a4629507-876a-405c-891c-5dcd521cf590-build-blob-cache\") pod \"service-telemetry-operator-2-build\" (UID: \"a4629507-876a-405c-891c-5dcd521cf590\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 16 00:23:37 crc kubenswrapper[4816]: I0316 00:23:37.159506 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/a4629507-876a-405c-891c-5dcd521cf590-buildcachedir\") pod \"service-telemetry-operator-2-build\" (UID: \"a4629507-876a-405c-891c-5dcd521cf590\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 16 00:23:37 crc kubenswrapper[4816]: I0316 00:23:37.261055 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-fs5z5-push\" (UniqueName: \"kubernetes.io/secret/a4629507-876a-405c-891c-5dcd521cf590-builder-dockercfg-fs5z5-push\") pod \"service-telemetry-operator-2-build\" (UID: \"a4629507-876a-405c-891c-5dcd521cf590\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 16 00:23:37 crc kubenswrapper[4816]: I0316 00:23:37.261403 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/a4629507-876a-405c-891c-5dcd521cf590-container-storage-root\") pod \"service-telemetry-operator-2-build\" (UID: \"a4629507-876a-405c-891c-5dcd521cf590\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 16 00:23:37 crc kubenswrapper[4816]: I0316 00:23:37.261531 4816 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/a4629507-876a-405c-891c-5dcd521cf590-buildworkdir\") pod \"service-telemetry-operator-2-build\" (UID: \"a4629507-876a-405c-891c-5dcd521cf590\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 16 00:23:37 crc kubenswrapper[4816]: I0316 00:23:37.261707 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/a4629507-876a-405c-891c-5dcd521cf590-build-blob-cache\") pod \"service-telemetry-operator-2-build\" (UID: \"a4629507-876a-405c-891c-5dcd521cf590\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 16 00:23:37 crc kubenswrapper[4816]: I0316 00:23:37.261799 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/a4629507-876a-405c-891c-5dcd521cf590-container-storage-root\") pod \"service-telemetry-operator-2-build\" (UID: \"a4629507-876a-405c-891c-5dcd521cf590\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 16 00:23:37 crc kubenswrapper[4816]: I0316 00:23:37.262014 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/a4629507-876a-405c-891c-5dcd521cf590-buildcachedir\") pod \"service-telemetry-operator-2-build\" (UID: \"a4629507-876a-405c-891c-5dcd521cf590\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 16 00:23:37 crc kubenswrapper[4816]: I0316 00:23:37.262199 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h7zgw\" (UniqueName: \"kubernetes.io/projected/a4629507-876a-405c-891c-5dcd521cf590-kube-api-access-h7zgw\") pod \"service-telemetry-operator-2-build\" (UID: \"a4629507-876a-405c-891c-5dcd521cf590\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 16 00:23:37 crc kubenswrapper[4816]: I0316 00:23:37.262321 4816 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/a4629507-876a-405c-891c-5dcd521cf590-build-blob-cache\") pod \"service-telemetry-operator-2-build\" (UID: \"a4629507-876a-405c-891c-5dcd521cf590\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 16 00:23:37 crc kubenswrapper[4816]: I0316 00:23:37.262044 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/a4629507-876a-405c-891c-5dcd521cf590-buildworkdir\") pod \"service-telemetry-operator-2-build\" (UID: \"a4629507-876a-405c-891c-5dcd521cf590\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 16 00:23:37 crc kubenswrapper[4816]: I0316 00:23:37.262092 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/a4629507-876a-405c-891c-5dcd521cf590-buildcachedir\") pod \"service-telemetry-operator-2-build\" (UID: \"a4629507-876a-405c-891c-5dcd521cf590\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 16 00:23:37 crc kubenswrapper[4816]: I0316 00:23:37.262642 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-fs5z5-pull\" (UniqueName: \"kubernetes.io/secret/a4629507-876a-405c-891c-5dcd521cf590-builder-dockercfg-fs5z5-pull\") pod \"service-telemetry-operator-2-build\" (UID: \"a4629507-876a-405c-891c-5dcd521cf590\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 16 00:23:37 crc kubenswrapper[4816]: I0316 00:23:37.262715 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a4629507-876a-405c-891c-5dcd521cf590-build-proxy-ca-bundles\") pod \"service-telemetry-operator-2-build\" (UID: \"a4629507-876a-405c-891c-5dcd521cf590\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 16 
00:23:37 crc kubenswrapper[4816]: I0316 00:23:37.262762 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/a4629507-876a-405c-891c-5dcd521cf590-node-pullsecrets\") pod \"service-telemetry-operator-2-build\" (UID: \"a4629507-876a-405c-891c-5dcd521cf590\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 16 00:23:37 crc kubenswrapper[4816]: I0316 00:23:37.262814 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/a4629507-876a-405c-891c-5dcd521cf590-container-storage-run\") pod \"service-telemetry-operator-2-build\" (UID: \"a4629507-876a-405c-891c-5dcd521cf590\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 16 00:23:37 crc kubenswrapper[4816]: I0316 00:23:37.262854 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/a4629507-876a-405c-891c-5dcd521cf590-build-system-configs\") pod \"service-telemetry-operator-2-build\" (UID: \"a4629507-876a-405c-891c-5dcd521cf590\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 16 00:23:37 crc kubenswrapper[4816]: I0316 00:23:37.262938 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/a4629507-876a-405c-891c-5dcd521cf590-node-pullsecrets\") pod \"service-telemetry-operator-2-build\" (UID: \"a4629507-876a-405c-891c-5dcd521cf590\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 16 00:23:37 crc kubenswrapper[4816]: I0316 00:23:37.262969 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a4629507-876a-405c-891c-5dcd521cf590-build-ca-bundles\") pod \"service-telemetry-operator-2-build\" (UID: 
\"a4629507-876a-405c-891c-5dcd521cf590\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 16 00:23:37 crc kubenswrapper[4816]: I0316 00:23:37.263137 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/a4629507-876a-405c-891c-5dcd521cf590-container-storage-run\") pod \"service-telemetry-operator-2-build\" (UID: \"a4629507-876a-405c-891c-5dcd521cf590\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 16 00:23:37 crc kubenswrapper[4816]: I0316 00:23:37.263396 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a4629507-876a-405c-891c-5dcd521cf590-build-proxy-ca-bundles\") pod \"service-telemetry-operator-2-build\" (UID: \"a4629507-876a-405c-891c-5dcd521cf590\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 16 00:23:37 crc kubenswrapper[4816]: I0316 00:23:37.263624 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/a4629507-876a-405c-891c-5dcd521cf590-build-system-configs\") pod \"service-telemetry-operator-2-build\" (UID: \"a4629507-876a-405c-891c-5dcd521cf590\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 16 00:23:37 crc kubenswrapper[4816]: I0316 00:23:37.264338 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a4629507-876a-405c-891c-5dcd521cf590-build-ca-bundles\") pod \"service-telemetry-operator-2-build\" (UID: \"a4629507-876a-405c-891c-5dcd521cf590\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 16 00:23:37 crc kubenswrapper[4816]: I0316 00:23:37.269088 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-fs5z5-pull\" (UniqueName: 
\"kubernetes.io/secret/a4629507-876a-405c-891c-5dcd521cf590-builder-dockercfg-fs5z5-pull\") pod \"service-telemetry-operator-2-build\" (UID: \"a4629507-876a-405c-891c-5dcd521cf590\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 16 00:23:37 crc kubenswrapper[4816]: I0316 00:23:37.270091 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-fs5z5-push\" (UniqueName: \"kubernetes.io/secret/a4629507-876a-405c-891c-5dcd521cf590-builder-dockercfg-fs5z5-push\") pod \"service-telemetry-operator-2-build\" (UID: \"a4629507-876a-405c-891c-5dcd521cf590\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 16 00:23:37 crc kubenswrapper[4816]: I0316 00:23:37.276877 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h7zgw\" (UniqueName: \"kubernetes.io/projected/a4629507-876a-405c-891c-5dcd521cf590-kube-api-access-h7zgw\") pod \"service-telemetry-operator-2-build\" (UID: \"a4629507-876a-405c-891c-5dcd521cf590\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 16 00:23:37 crc kubenswrapper[4816]: I0316 00:23:37.332359 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-2-build" Mar 16 00:23:38 crc kubenswrapper[4816]: I0316 00:23:38.319975 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-2-build"] Mar 16 00:23:38 crc kubenswrapper[4816]: I0316 00:23:38.970924 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-2-build" event={"ID":"a4629507-876a-405c-891c-5dcd521cf590","Type":"ContainerStarted","Data":"afc880cd89cb3d0dd5c36035edaf726279cc4d27b43fc12a6df286ecc563c314"} Mar 16 00:23:39 crc kubenswrapper[4816]: I0316 00:23:39.984119 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-2-build" event={"ID":"a4629507-876a-405c-891c-5dcd521cf590","Type":"ContainerStarted","Data":"863e0a61dd1fbf63d5e851bf29401be711920c970a57fdf0c47e4215e8849370"} Mar 16 00:23:39 crc kubenswrapper[4816]: I0316 00:23:39.991992 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-1-build_1fe11315-8a31-4f80-b084-fdb8542e0074/docker-build/0.log" Mar 16 00:23:39 crc kubenswrapper[4816]: I0316 00:23:39.997115 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-1-build" event={"ID":"1fe11315-8a31-4f80-b084-fdb8542e0074","Type":"ContainerDied","Data":"5ab87e5b52ff667c30c02fb846b96580f3a8bdb5023de19607d2656e42f7c8c7"} Mar 16 00:23:39 crc kubenswrapper[4816]: I0316 00:23:39.997333 4816 generic.go:334] "Generic (PLEG): container finished" podID="1fe11315-8a31-4f80-b084-fdb8542e0074" containerID="5ab87e5b52ff667c30c02fb846b96580f3a8bdb5023de19607d2656e42f7c8c7" exitCode=1 Mar 16 00:23:40 crc kubenswrapper[4816]: I0316 00:23:40.082352 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-1-build_1fe11315-8a31-4f80-b084-fdb8542e0074/docker-build/0.log" Mar 16 00:23:40 crc 
kubenswrapper[4816]: I0316 00:23:40.082643 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-1-build" Mar 16 00:23:40 crc kubenswrapper[4816]: I0316 00:23:40.205165 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/1fe11315-8a31-4f80-b084-fdb8542e0074-node-pullsecrets\") pod \"1fe11315-8a31-4f80-b084-fdb8542e0074\" (UID: \"1fe11315-8a31-4f80-b084-fdb8542e0074\") " Mar 16 00:23:40 crc kubenswrapper[4816]: I0316 00:23:40.205268 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/1fe11315-8a31-4f80-b084-fdb8542e0074-buildworkdir\") pod \"1fe11315-8a31-4f80-b084-fdb8542e0074\" (UID: \"1fe11315-8a31-4f80-b084-fdb8542e0074\") " Mar 16 00:23:40 crc kubenswrapper[4816]: I0316 00:23:40.205326 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/1fe11315-8a31-4f80-b084-fdb8542e0074-container-storage-run\") pod \"1fe11315-8a31-4f80-b084-fdb8542e0074\" (UID: \"1fe11315-8a31-4f80-b084-fdb8542e0074\") " Mar 16 00:23:40 crc kubenswrapper[4816]: I0316 00:23:40.205362 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1fe11315-8a31-4f80-b084-fdb8542e0074-build-ca-bundles\") pod \"1fe11315-8a31-4f80-b084-fdb8542e0074\" (UID: \"1fe11315-8a31-4f80-b084-fdb8542e0074\") " Mar 16 00:23:40 crc kubenswrapper[4816]: I0316 00:23:40.205404 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wphcb\" (UniqueName: \"kubernetes.io/projected/1fe11315-8a31-4f80-b084-fdb8542e0074-kube-api-access-wphcb\") pod \"1fe11315-8a31-4f80-b084-fdb8542e0074\" (UID: \"1fe11315-8a31-4f80-b084-fdb8542e0074\") " Mar 16 
00:23:40 crc kubenswrapper[4816]: I0316 00:23:40.205451 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/1fe11315-8a31-4f80-b084-fdb8542e0074-container-storage-root\") pod \"1fe11315-8a31-4f80-b084-fdb8542e0074\" (UID: \"1fe11315-8a31-4f80-b084-fdb8542e0074\") " Mar 16 00:23:40 crc kubenswrapper[4816]: I0316 00:23:40.205524 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1fe11315-8a31-4f80-b084-fdb8542e0074-build-proxy-ca-bundles\") pod \"1fe11315-8a31-4f80-b084-fdb8542e0074\" (UID: \"1fe11315-8a31-4f80-b084-fdb8542e0074\") " Mar 16 00:23:40 crc kubenswrapper[4816]: I0316 00:23:40.205610 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-fs5z5-pull\" (UniqueName: \"kubernetes.io/secret/1fe11315-8a31-4f80-b084-fdb8542e0074-builder-dockercfg-fs5z5-pull\") pod \"1fe11315-8a31-4f80-b084-fdb8542e0074\" (UID: \"1fe11315-8a31-4f80-b084-fdb8542e0074\") " Mar 16 00:23:40 crc kubenswrapper[4816]: I0316 00:23:40.205663 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/1fe11315-8a31-4f80-b084-fdb8542e0074-buildcachedir\") pod \"1fe11315-8a31-4f80-b084-fdb8542e0074\" (UID: \"1fe11315-8a31-4f80-b084-fdb8542e0074\") " Mar 16 00:23:40 crc kubenswrapper[4816]: I0316 00:23:40.205725 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-fs5z5-push\" (UniqueName: \"kubernetes.io/secret/1fe11315-8a31-4f80-b084-fdb8542e0074-builder-dockercfg-fs5z5-push\") pod \"1fe11315-8a31-4f80-b084-fdb8542e0074\" (UID: \"1fe11315-8a31-4f80-b084-fdb8542e0074\") " Mar 16 00:23:40 crc kubenswrapper[4816]: I0316 00:23:40.205765 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/1fe11315-8a31-4f80-b084-fdb8542e0074-build-blob-cache\") pod \"1fe11315-8a31-4f80-b084-fdb8542e0074\" (UID: \"1fe11315-8a31-4f80-b084-fdb8542e0074\") " Mar 16 00:23:40 crc kubenswrapper[4816]: I0316 00:23:40.205802 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/1fe11315-8a31-4f80-b084-fdb8542e0074-build-system-configs\") pod \"1fe11315-8a31-4f80-b084-fdb8542e0074\" (UID: \"1fe11315-8a31-4f80-b084-fdb8542e0074\") " Mar 16 00:23:40 crc kubenswrapper[4816]: I0316 00:23:40.205897 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1fe11315-8a31-4f80-b084-fdb8542e0074-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "1fe11315-8a31-4f80-b084-fdb8542e0074" (UID: "1fe11315-8a31-4f80-b084-fdb8542e0074"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:23:40 crc kubenswrapper[4816]: I0316 00:23:40.206152 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1fe11315-8a31-4f80-b084-fdb8542e0074-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "1fe11315-8a31-4f80-b084-fdb8542e0074" (UID: "1fe11315-8a31-4f80-b084-fdb8542e0074"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:23:40 crc kubenswrapper[4816]: I0316 00:23:40.206385 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1fe11315-8a31-4f80-b084-fdb8542e0074-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "1fe11315-8a31-4f80-b084-fdb8542e0074" (UID: "1fe11315-8a31-4f80-b084-fdb8542e0074"). InnerVolumeSpecName "node-pullsecrets". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:23:40 crc kubenswrapper[4816]: I0316 00:23:40.206405 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1fe11315-8a31-4f80-b084-fdb8542e0074-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "1fe11315-8a31-4f80-b084-fdb8542e0074" (UID: "1fe11315-8a31-4f80-b084-fdb8542e0074"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:23:40 crc kubenswrapper[4816]: I0316 00:23:40.206937 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1fe11315-8a31-4f80-b084-fdb8542e0074-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "1fe11315-8a31-4f80-b084-fdb8542e0074" (UID: "1fe11315-8a31-4f80-b084-fdb8542e0074"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:23:40 crc kubenswrapper[4816]: I0316 00:23:40.207023 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1fe11315-8a31-4f80-b084-fdb8542e0074-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "1fe11315-8a31-4f80-b084-fdb8542e0074" (UID: "1fe11315-8a31-4f80-b084-fdb8542e0074"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:23:40 crc kubenswrapper[4816]: I0316 00:23:40.207206 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1fe11315-8a31-4f80-b084-fdb8542e0074-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "1fe11315-8a31-4f80-b084-fdb8542e0074" (UID: "1fe11315-8a31-4f80-b084-fdb8542e0074"). InnerVolumeSpecName "build-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:23:40 crc kubenswrapper[4816]: I0316 00:23:40.207209 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1fe11315-8a31-4f80-b084-fdb8542e0074-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "1fe11315-8a31-4f80-b084-fdb8542e0074" (UID: "1fe11315-8a31-4f80-b084-fdb8542e0074"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:23:40 crc kubenswrapper[4816]: I0316 00:23:40.207758 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1fe11315-8a31-4f80-b084-fdb8542e0074-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "1fe11315-8a31-4f80-b084-fdb8542e0074" (UID: "1fe11315-8a31-4f80-b084-fdb8542e0074"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:23:40 crc kubenswrapper[4816]: I0316 00:23:40.208139 4816 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/1fe11315-8a31-4f80-b084-fdb8542e0074-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 16 00:23:40 crc kubenswrapper[4816]: I0316 00:23:40.208260 4816 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/1fe11315-8a31-4f80-b084-fdb8542e0074-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 16 00:23:40 crc kubenswrapper[4816]: I0316 00:23:40.208341 4816 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1fe11315-8a31-4f80-b084-fdb8542e0074-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 16 00:23:40 crc kubenswrapper[4816]: I0316 00:23:40.208425 4816 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: 
\"kubernetes.io/empty-dir/1fe11315-8a31-4f80-b084-fdb8542e0074-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 16 00:23:40 crc kubenswrapper[4816]: I0316 00:23:40.208500 4816 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1fe11315-8a31-4f80-b084-fdb8542e0074-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 16 00:23:40 crc kubenswrapper[4816]: I0316 00:23:40.208691 4816 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/1fe11315-8a31-4f80-b084-fdb8542e0074-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 16 00:23:40 crc kubenswrapper[4816]: I0316 00:23:40.208798 4816 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/1fe11315-8a31-4f80-b084-fdb8542e0074-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 16 00:23:40 crc kubenswrapper[4816]: I0316 00:23:40.208902 4816 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/1fe11315-8a31-4f80-b084-fdb8542e0074-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 16 00:23:40 crc kubenswrapper[4816]: I0316 00:23:40.208997 4816 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/1fe11315-8a31-4f80-b084-fdb8542e0074-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 16 00:23:40 crc kubenswrapper[4816]: I0316 00:23:40.217258 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fe11315-8a31-4f80-b084-fdb8542e0074-builder-dockercfg-fs5z5-push" (OuterVolumeSpecName: "builder-dockercfg-fs5z5-push") pod "1fe11315-8a31-4f80-b084-fdb8542e0074" (UID: "1fe11315-8a31-4f80-b084-fdb8542e0074"). InnerVolumeSpecName "builder-dockercfg-fs5z5-push". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:23:40 crc kubenswrapper[4816]: I0316 00:23:40.220799 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1fe11315-8a31-4f80-b084-fdb8542e0074-kube-api-access-wphcb" (OuterVolumeSpecName: "kube-api-access-wphcb") pod "1fe11315-8a31-4f80-b084-fdb8542e0074" (UID: "1fe11315-8a31-4f80-b084-fdb8542e0074"). InnerVolumeSpecName "kube-api-access-wphcb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:23:40 crc kubenswrapper[4816]: I0316 00:23:40.222885 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fe11315-8a31-4f80-b084-fdb8542e0074-builder-dockercfg-fs5z5-pull" (OuterVolumeSpecName: "builder-dockercfg-fs5z5-pull") pod "1fe11315-8a31-4f80-b084-fdb8542e0074" (UID: "1fe11315-8a31-4f80-b084-fdb8542e0074"). InnerVolumeSpecName "builder-dockercfg-fs5z5-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:23:40 crc kubenswrapper[4816]: I0316 00:23:40.310324 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wphcb\" (UniqueName: \"kubernetes.io/projected/1fe11315-8a31-4f80-b084-fdb8542e0074-kube-api-access-wphcb\") on node \"crc\" DevicePath \"\"" Mar 16 00:23:40 crc kubenswrapper[4816]: I0316 00:23:40.310367 4816 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-fs5z5-pull\" (UniqueName: \"kubernetes.io/secret/1fe11315-8a31-4f80-b084-fdb8542e0074-builder-dockercfg-fs5z5-pull\") on node \"crc\" DevicePath \"\"" Mar 16 00:23:40 crc kubenswrapper[4816]: I0316 00:23:40.310380 4816 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-fs5z5-push\" (UniqueName: \"kubernetes.io/secret/1fe11315-8a31-4f80-b084-fdb8542e0074-builder-dockercfg-fs5z5-push\") on node \"crc\" DevicePath \"\"" Mar 16 00:23:41 crc kubenswrapper[4816]: I0316 00:23:41.005323 4816 log.go:25] "Finished parsing log file" 
path="/var/log/pods/service-telemetry_service-telemetry-operator-1-build_1fe11315-8a31-4f80-b084-fdb8542e0074/docker-build/0.log" Mar 16 00:23:41 crc kubenswrapper[4816]: I0316 00:23:41.006424 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-1-build" Mar 16 00:23:41 crc kubenswrapper[4816]: I0316 00:23:41.006544 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-1-build" event={"ID":"1fe11315-8a31-4f80-b084-fdb8542e0074","Type":"ContainerDied","Data":"be6f67c8612bf615c5637b63abc0ae83d660784155c8869bc8e23fccdb9f8c21"} Mar 16 00:23:41 crc kubenswrapper[4816]: I0316 00:23:41.006698 4816 scope.go:117] "RemoveContainer" containerID="5ab87e5b52ff667c30c02fb846b96580f3a8bdb5023de19607d2656e42f7c8c7" Mar 16 00:23:41 crc kubenswrapper[4816]: I0316 00:23:41.029594 4816 scope.go:117] "RemoveContainer" containerID="260d19a4e09f99c50d12150236c719bb65b9e9ace774386f70e495d800792e5c" Mar 16 00:23:41 crc kubenswrapper[4816]: I0316 00:23:41.032956 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/service-telemetry-operator-1-build"] Mar 16 00:23:41 crc kubenswrapper[4816]: I0316 00:23:41.043051 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/service-telemetry-operator-1-build"] Mar 16 00:23:41 crc kubenswrapper[4816]: I0316 00:23:41.674707 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1fe11315-8a31-4f80-b084-fdb8542e0074" path="/var/lib/kubelet/pods/1fe11315-8a31-4f80-b084-fdb8542e0074/volumes" Mar 16 00:23:48 crc kubenswrapper[4816]: I0316 00:23:48.049173 4816 generic.go:334] "Generic (PLEG): container finished" podID="a4629507-876a-405c-891c-5dcd521cf590" containerID="863e0a61dd1fbf63d5e851bf29401be711920c970a57fdf0c47e4215e8849370" exitCode=0 Mar 16 00:23:48 crc kubenswrapper[4816]: I0316 00:23:48.049291 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="service-telemetry/service-telemetry-operator-2-build" event={"ID":"a4629507-876a-405c-891c-5dcd521cf590","Type":"ContainerDied","Data":"863e0a61dd1fbf63d5e851bf29401be711920c970a57fdf0c47e4215e8849370"} Mar 16 00:23:49 crc kubenswrapper[4816]: I0316 00:23:49.059065 4816 generic.go:334] "Generic (PLEG): container finished" podID="a4629507-876a-405c-891c-5dcd521cf590" containerID="23e82f1c92387ad59df9c54ccbf20b2c5dd61bbb9ff88c126dddd725b46d94c0" exitCode=0 Mar 16 00:23:49 crc kubenswrapper[4816]: I0316 00:23:49.059122 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-2-build" event={"ID":"a4629507-876a-405c-891c-5dcd521cf590","Type":"ContainerDied","Data":"23e82f1c92387ad59df9c54ccbf20b2c5dd61bbb9ff88c126dddd725b46d94c0"} Mar 16 00:23:49 crc kubenswrapper[4816]: I0316 00:23:49.099538 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-2-build_a4629507-876a-405c-891c-5dcd521cf590/manage-dockerfile/0.log" Mar 16 00:23:50 crc kubenswrapper[4816]: I0316 00:23:50.067054 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-2-build" event={"ID":"a4629507-876a-405c-891c-5dcd521cf590","Type":"ContainerStarted","Data":"569da32a07b2521faaf5205b0d1082783f2685ea5a711b0eadcc5491dd41185a"} Mar 16 00:23:50 crc kubenswrapper[4816]: I0316 00:23:50.098575 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/service-telemetry-operator-2-build" podStartSLOduration=14.098513046 podStartE2EDuration="14.098513046s" podCreationTimestamp="2026-03-16 00:23:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:23:50.096288921 +0000 UTC m=+1023.192588884" watchObservedRunningTime="2026-03-16 00:23:50.098513046 +0000 UTC m=+1023.194813009" Mar 16 00:24:00 crc kubenswrapper[4816]: I0316 
00:24:00.137147 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29560344-qmt9b"] Mar 16 00:24:00 crc kubenswrapper[4816]: E0316 00:24:00.139888 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fe11315-8a31-4f80-b084-fdb8542e0074" containerName="manage-dockerfile" Mar 16 00:24:00 crc kubenswrapper[4816]: I0316 00:24:00.139905 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fe11315-8a31-4f80-b084-fdb8542e0074" containerName="manage-dockerfile" Mar 16 00:24:00 crc kubenswrapper[4816]: E0316 00:24:00.139932 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fe11315-8a31-4f80-b084-fdb8542e0074" containerName="docker-build" Mar 16 00:24:00 crc kubenswrapper[4816]: I0316 00:24:00.139940 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fe11315-8a31-4f80-b084-fdb8542e0074" containerName="docker-build" Mar 16 00:24:00 crc kubenswrapper[4816]: I0316 00:24:00.140064 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="1fe11315-8a31-4f80-b084-fdb8542e0074" containerName="docker-build" Mar 16 00:24:00 crc kubenswrapper[4816]: I0316 00:24:00.140546 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29560344-qmt9b" Mar 16 00:24:00 crc kubenswrapper[4816]: I0316 00:24:00.143801 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29560344-qmt9b"] Mar 16 00:24:00 crc kubenswrapper[4816]: I0316 00:24:00.164644 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 16 00:24:00 crc kubenswrapper[4816]: I0316 00:24:00.164781 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 16 00:24:00 crc kubenswrapper[4816]: I0316 00:24:00.164954 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-8hc2r" Mar 16 00:24:00 crc kubenswrapper[4816]: I0316 00:24:00.266512 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ksgrh\" (UniqueName: \"kubernetes.io/projected/add0bb36-9ea9-49c0-9dc6-ba31c3cfcf2d-kube-api-access-ksgrh\") pod \"auto-csr-approver-29560344-qmt9b\" (UID: \"add0bb36-9ea9-49c0-9dc6-ba31c3cfcf2d\") " pod="openshift-infra/auto-csr-approver-29560344-qmt9b" Mar 16 00:24:00 crc kubenswrapper[4816]: I0316 00:24:00.367839 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ksgrh\" (UniqueName: \"kubernetes.io/projected/add0bb36-9ea9-49c0-9dc6-ba31c3cfcf2d-kube-api-access-ksgrh\") pod \"auto-csr-approver-29560344-qmt9b\" (UID: \"add0bb36-9ea9-49c0-9dc6-ba31c3cfcf2d\") " pod="openshift-infra/auto-csr-approver-29560344-qmt9b" Mar 16 00:24:00 crc kubenswrapper[4816]: I0316 00:24:00.386714 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ksgrh\" (UniqueName: \"kubernetes.io/projected/add0bb36-9ea9-49c0-9dc6-ba31c3cfcf2d-kube-api-access-ksgrh\") pod \"auto-csr-approver-29560344-qmt9b\" (UID: \"add0bb36-9ea9-49c0-9dc6-ba31c3cfcf2d\") " 
pod="openshift-infra/auto-csr-approver-29560344-qmt9b" Mar 16 00:24:00 crc kubenswrapper[4816]: I0316 00:24:00.486687 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560344-qmt9b" Mar 16 00:24:00 crc kubenswrapper[4816]: I0316 00:24:00.726726 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29560344-qmt9b"] Mar 16 00:24:01 crc kubenswrapper[4816]: I0316 00:24:01.133921 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560344-qmt9b" event={"ID":"add0bb36-9ea9-49c0-9dc6-ba31c3cfcf2d","Type":"ContainerStarted","Data":"31c29a72f860f347baf8d5002acc0aa9f6f3e1cd72c28219086ab49c38e3181a"} Mar 16 00:24:02 crc kubenswrapper[4816]: I0316 00:24:02.140510 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560344-qmt9b" event={"ID":"add0bb36-9ea9-49c0-9dc6-ba31c3cfcf2d","Type":"ContainerStarted","Data":"6050167de1d894cd0016711271e17ed54f0e6320bd8403d36883159d39c3c966"} Mar 16 00:24:02 crc kubenswrapper[4816]: I0316 00:24:02.154608 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29560344-qmt9b" podStartSLOduration=1.22612965 podStartE2EDuration="2.154591898s" podCreationTimestamp="2026-03-16 00:24:00 +0000 UTC" firstStartedPulling="2026-03-16 00:24:00.750791386 +0000 UTC m=+1033.847091339" lastFinishedPulling="2026-03-16 00:24:01.679253634 +0000 UTC m=+1034.775553587" observedRunningTime="2026-03-16 00:24:02.154267938 +0000 UTC m=+1035.250567901" watchObservedRunningTime="2026-03-16 00:24:02.154591898 +0000 UTC m=+1035.250891851" Mar 16 00:24:03 crc kubenswrapper[4816]: I0316 00:24:03.146549 4816 generic.go:334] "Generic (PLEG): container finished" podID="add0bb36-9ea9-49c0-9dc6-ba31c3cfcf2d" containerID="6050167de1d894cd0016711271e17ed54f0e6320bd8403d36883159d39c3c966" exitCode=0 Mar 16 00:24:03 crc 
kubenswrapper[4816]: I0316 00:24:03.146604 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560344-qmt9b" event={"ID":"add0bb36-9ea9-49c0-9dc6-ba31c3cfcf2d","Type":"ContainerDied","Data":"6050167de1d894cd0016711271e17ed54f0e6320bd8403d36883159d39c3c966"} Mar 16 00:24:04 crc kubenswrapper[4816]: I0316 00:24:04.382794 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560344-qmt9b" Mar 16 00:24:04 crc kubenswrapper[4816]: I0316 00:24:04.552145 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ksgrh\" (UniqueName: \"kubernetes.io/projected/add0bb36-9ea9-49c0-9dc6-ba31c3cfcf2d-kube-api-access-ksgrh\") pod \"add0bb36-9ea9-49c0-9dc6-ba31c3cfcf2d\" (UID: \"add0bb36-9ea9-49c0-9dc6-ba31c3cfcf2d\") " Mar 16 00:24:04 crc kubenswrapper[4816]: I0316 00:24:04.557266 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/add0bb36-9ea9-49c0-9dc6-ba31c3cfcf2d-kube-api-access-ksgrh" (OuterVolumeSpecName: "kube-api-access-ksgrh") pod "add0bb36-9ea9-49c0-9dc6-ba31c3cfcf2d" (UID: "add0bb36-9ea9-49c0-9dc6-ba31c3cfcf2d"). InnerVolumeSpecName "kube-api-access-ksgrh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:24:04 crc kubenswrapper[4816]: I0316 00:24:04.653656 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ksgrh\" (UniqueName: \"kubernetes.io/projected/add0bb36-9ea9-49c0-9dc6-ba31c3cfcf2d-kube-api-access-ksgrh\") on node \"crc\" DevicePath \"\"" Mar 16 00:24:05 crc kubenswrapper[4816]: I0316 00:24:05.160461 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560344-qmt9b" event={"ID":"add0bb36-9ea9-49c0-9dc6-ba31c3cfcf2d","Type":"ContainerDied","Data":"31c29a72f860f347baf8d5002acc0aa9f6f3e1cd72c28219086ab49c38e3181a"} Mar 16 00:24:05 crc kubenswrapper[4816]: I0316 00:24:05.160497 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="31c29a72f860f347baf8d5002acc0aa9f6f3e1cd72c28219086ab49c38e3181a" Mar 16 00:24:05 crc kubenswrapper[4816]: I0316 00:24:05.160542 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560344-qmt9b" Mar 16 00:24:05 crc kubenswrapper[4816]: I0316 00:24:05.215534 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29560338-8bkf9"] Mar 16 00:24:05 crc kubenswrapper[4816]: I0316 00:24:05.233573 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29560338-8bkf9"] Mar 16 00:24:05 crc kubenswrapper[4816]: I0316 00:24:05.675277 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6cfda38e-dbdc-4b42-8a0d-964103ee01cd" path="/var/lib/kubelet/pods/6cfda38e-dbdc-4b42-8a0d-964103ee01cd/volumes" Mar 16 00:24:41 crc kubenswrapper[4816]: I0316 00:24:41.572157 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-cql79"] Mar 16 00:24:41 crc kubenswrapper[4816]: E0316 00:24:41.572997 4816 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="add0bb36-9ea9-49c0-9dc6-ba31c3cfcf2d" containerName="oc" Mar 16 00:24:41 crc kubenswrapper[4816]: I0316 00:24:41.573013 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="add0bb36-9ea9-49c0-9dc6-ba31c3cfcf2d" containerName="oc" Mar 16 00:24:41 crc kubenswrapper[4816]: I0316 00:24:41.573167 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="add0bb36-9ea9-49c0-9dc6-ba31c3cfcf2d" containerName="oc" Mar 16 00:24:41 crc kubenswrapper[4816]: I0316 00:24:41.574315 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cql79" Mar 16 00:24:41 crc kubenswrapper[4816]: I0316 00:24:41.582846 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-cql79"] Mar 16 00:24:41 crc kubenswrapper[4816]: I0316 00:24:41.662715 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4cadddc-b411-48ca-b4d3-dc7fdf9767dd-utilities\") pod \"community-operators-cql79\" (UID: \"a4cadddc-b411-48ca-b4d3-dc7fdf9767dd\") " pod="openshift-marketplace/community-operators-cql79" Mar 16 00:24:41 crc kubenswrapper[4816]: I0316 00:24:41.662773 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4cadddc-b411-48ca-b4d3-dc7fdf9767dd-catalog-content\") pod \"community-operators-cql79\" (UID: \"a4cadddc-b411-48ca-b4d3-dc7fdf9767dd\") " pod="openshift-marketplace/community-operators-cql79" Mar 16 00:24:41 crc kubenswrapper[4816]: I0316 00:24:41.662845 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cl7m9\" (UniqueName: \"kubernetes.io/projected/a4cadddc-b411-48ca-b4d3-dc7fdf9767dd-kube-api-access-cl7m9\") pod \"community-operators-cql79\" (UID: \"a4cadddc-b411-48ca-b4d3-dc7fdf9767dd\") " 
pod="openshift-marketplace/community-operators-cql79" Mar 16 00:24:41 crc kubenswrapper[4816]: I0316 00:24:41.764257 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4cadddc-b411-48ca-b4d3-dc7fdf9767dd-utilities\") pod \"community-operators-cql79\" (UID: \"a4cadddc-b411-48ca-b4d3-dc7fdf9767dd\") " pod="openshift-marketplace/community-operators-cql79" Mar 16 00:24:41 crc kubenswrapper[4816]: I0316 00:24:41.764305 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4cadddc-b411-48ca-b4d3-dc7fdf9767dd-catalog-content\") pod \"community-operators-cql79\" (UID: \"a4cadddc-b411-48ca-b4d3-dc7fdf9767dd\") " pod="openshift-marketplace/community-operators-cql79" Mar 16 00:24:41 crc kubenswrapper[4816]: I0316 00:24:41.764351 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cl7m9\" (UniqueName: \"kubernetes.io/projected/a4cadddc-b411-48ca-b4d3-dc7fdf9767dd-kube-api-access-cl7m9\") pod \"community-operators-cql79\" (UID: \"a4cadddc-b411-48ca-b4d3-dc7fdf9767dd\") " pod="openshift-marketplace/community-operators-cql79" Mar 16 00:24:41 crc kubenswrapper[4816]: I0316 00:24:41.764824 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4cadddc-b411-48ca-b4d3-dc7fdf9767dd-utilities\") pod \"community-operators-cql79\" (UID: \"a4cadddc-b411-48ca-b4d3-dc7fdf9767dd\") " pod="openshift-marketplace/community-operators-cql79" Mar 16 00:24:41 crc kubenswrapper[4816]: I0316 00:24:41.764909 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4cadddc-b411-48ca-b4d3-dc7fdf9767dd-catalog-content\") pod \"community-operators-cql79\" (UID: \"a4cadddc-b411-48ca-b4d3-dc7fdf9767dd\") " 
pod="openshift-marketplace/community-operators-cql79" Mar 16 00:24:41 crc kubenswrapper[4816]: I0316 00:24:41.783220 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cl7m9\" (UniqueName: \"kubernetes.io/projected/a4cadddc-b411-48ca-b4d3-dc7fdf9767dd-kube-api-access-cl7m9\") pod \"community-operators-cql79\" (UID: \"a4cadddc-b411-48ca-b4d3-dc7fdf9767dd\") " pod="openshift-marketplace/community-operators-cql79" Mar 16 00:24:41 crc kubenswrapper[4816]: I0316 00:24:41.904295 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cql79" Mar 16 00:24:42 crc kubenswrapper[4816]: I0316 00:24:42.402516 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-cql79"] Mar 16 00:24:43 crc kubenswrapper[4816]: I0316 00:24:43.394461 4816 generic.go:334] "Generic (PLEG): container finished" podID="a4cadddc-b411-48ca-b4d3-dc7fdf9767dd" containerID="9917112c1913ecdfd392c35ecd560b26665b5ee21c579ae017d9f0c4b26e303d" exitCode=0 Mar 16 00:24:43 crc kubenswrapper[4816]: I0316 00:24:43.394531 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cql79" event={"ID":"a4cadddc-b411-48ca-b4d3-dc7fdf9767dd","Type":"ContainerDied","Data":"9917112c1913ecdfd392c35ecd560b26665b5ee21c579ae017d9f0c4b26e303d"} Mar 16 00:24:43 crc kubenswrapper[4816]: I0316 00:24:43.394872 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cql79" event={"ID":"a4cadddc-b411-48ca-b4d3-dc7fdf9767dd","Type":"ContainerStarted","Data":"7206621fed299d699bb828068794b51424cc6881ba7958e6018750c2a55ad6a7"} Mar 16 00:24:45 crc kubenswrapper[4816]: I0316 00:24:45.408776 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cql79" 
event={"ID":"a4cadddc-b411-48ca-b4d3-dc7fdf9767dd","Type":"ContainerStarted","Data":"90565ad1dfc64c2cda6b92e812af328722e858d91cfadac9340c716d273d76ff"} Mar 16 00:24:46 crc kubenswrapper[4816]: I0316 00:24:46.417998 4816 generic.go:334] "Generic (PLEG): container finished" podID="a4cadddc-b411-48ca-b4d3-dc7fdf9767dd" containerID="90565ad1dfc64c2cda6b92e812af328722e858d91cfadac9340c716d273d76ff" exitCode=0 Mar 16 00:24:46 crc kubenswrapper[4816]: I0316 00:24:46.418069 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cql79" event={"ID":"a4cadddc-b411-48ca-b4d3-dc7fdf9767dd","Type":"ContainerDied","Data":"90565ad1dfc64c2cda6b92e812af328722e858d91cfadac9340c716d273d76ff"} Mar 16 00:24:47 crc kubenswrapper[4816]: I0316 00:24:47.425882 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cql79" event={"ID":"a4cadddc-b411-48ca-b4d3-dc7fdf9767dd","Type":"ContainerStarted","Data":"3b262fd21eee0d19abdc5924d306ef1a5f47e904335e542336dc28173c0a4a65"} Mar 16 00:24:47 crc kubenswrapper[4816]: I0316 00:24:47.446666 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-cql79" podStartSLOduration=2.9410904540000002 podStartE2EDuration="6.446645611s" podCreationTimestamp="2026-03-16 00:24:41 +0000 UTC" firstStartedPulling="2026-03-16 00:24:43.39802039 +0000 UTC m=+1076.494320343" lastFinishedPulling="2026-03-16 00:24:46.903575547 +0000 UTC m=+1079.999875500" observedRunningTime="2026-03-16 00:24:47.442645932 +0000 UTC m=+1080.538945885" watchObservedRunningTime="2026-03-16 00:24:47.446645611 +0000 UTC m=+1080.542945574" Mar 16 00:24:51 crc kubenswrapper[4816]: I0316 00:24:51.904932 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-cql79" Mar 16 00:24:51 crc kubenswrapper[4816]: I0316 00:24:51.905686 4816 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-marketplace/community-operators-cql79" Mar 16 00:24:51 crc kubenswrapper[4816]: I0316 00:24:51.949241 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-cql79" Mar 16 00:24:52 crc kubenswrapper[4816]: I0316 00:24:52.499349 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-cql79" Mar 16 00:24:52 crc kubenswrapper[4816]: I0316 00:24:52.540191 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-cql79"] Mar 16 00:24:54 crc kubenswrapper[4816]: I0316 00:24:54.471852 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-cql79" podUID="a4cadddc-b411-48ca-b4d3-dc7fdf9767dd" containerName="registry-server" containerID="cri-o://3b262fd21eee0d19abdc5924d306ef1a5f47e904335e542336dc28173c0a4a65" gracePeriod=2 Mar 16 00:24:54 crc kubenswrapper[4816]: I0316 00:24:54.847399 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-cql79" Mar 16 00:24:55 crc kubenswrapper[4816]: I0316 00:24:55.032537 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4cadddc-b411-48ca-b4d3-dc7fdf9767dd-catalog-content\") pod \"a4cadddc-b411-48ca-b4d3-dc7fdf9767dd\" (UID: \"a4cadddc-b411-48ca-b4d3-dc7fdf9767dd\") " Mar 16 00:24:55 crc kubenswrapper[4816]: I0316 00:24:55.032655 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4cadddc-b411-48ca-b4d3-dc7fdf9767dd-utilities\") pod \"a4cadddc-b411-48ca-b4d3-dc7fdf9767dd\" (UID: \"a4cadddc-b411-48ca-b4d3-dc7fdf9767dd\") " Mar 16 00:24:55 crc kubenswrapper[4816]: I0316 00:24:55.032753 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cl7m9\" (UniqueName: \"kubernetes.io/projected/a4cadddc-b411-48ca-b4d3-dc7fdf9767dd-kube-api-access-cl7m9\") pod \"a4cadddc-b411-48ca-b4d3-dc7fdf9767dd\" (UID: \"a4cadddc-b411-48ca-b4d3-dc7fdf9767dd\") " Mar 16 00:24:55 crc kubenswrapper[4816]: I0316 00:24:55.033604 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a4cadddc-b411-48ca-b4d3-dc7fdf9767dd-utilities" (OuterVolumeSpecName: "utilities") pod "a4cadddc-b411-48ca-b4d3-dc7fdf9767dd" (UID: "a4cadddc-b411-48ca-b4d3-dc7fdf9767dd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:24:55 crc kubenswrapper[4816]: I0316 00:24:55.037822 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4cadddc-b411-48ca-b4d3-dc7fdf9767dd-kube-api-access-cl7m9" (OuterVolumeSpecName: "kube-api-access-cl7m9") pod "a4cadddc-b411-48ca-b4d3-dc7fdf9767dd" (UID: "a4cadddc-b411-48ca-b4d3-dc7fdf9767dd"). InnerVolumeSpecName "kube-api-access-cl7m9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:24:55 crc kubenswrapper[4816]: I0316 00:24:55.099054 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a4cadddc-b411-48ca-b4d3-dc7fdf9767dd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a4cadddc-b411-48ca-b4d3-dc7fdf9767dd" (UID: "a4cadddc-b411-48ca-b4d3-dc7fdf9767dd"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:24:55 crc kubenswrapper[4816]: I0316 00:24:55.134620 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cl7m9\" (UniqueName: \"kubernetes.io/projected/a4cadddc-b411-48ca-b4d3-dc7fdf9767dd-kube-api-access-cl7m9\") on node \"crc\" DevicePath \"\"" Mar 16 00:24:55 crc kubenswrapper[4816]: I0316 00:24:55.134661 4816 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4cadddc-b411-48ca-b4d3-dc7fdf9767dd-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 16 00:24:55 crc kubenswrapper[4816]: I0316 00:24:55.134674 4816 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4cadddc-b411-48ca-b4d3-dc7fdf9767dd-utilities\") on node \"crc\" DevicePath \"\"" Mar 16 00:24:55 crc kubenswrapper[4816]: I0316 00:24:55.479956 4816 generic.go:334] "Generic (PLEG): container finished" podID="a4cadddc-b411-48ca-b4d3-dc7fdf9767dd" containerID="3b262fd21eee0d19abdc5924d306ef1a5f47e904335e542336dc28173c0a4a65" exitCode=0 Mar 16 00:24:55 crc kubenswrapper[4816]: I0316 00:24:55.480029 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-cql79" Mar 16 00:24:55 crc kubenswrapper[4816]: I0316 00:24:55.480018 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cql79" event={"ID":"a4cadddc-b411-48ca-b4d3-dc7fdf9767dd","Type":"ContainerDied","Data":"3b262fd21eee0d19abdc5924d306ef1a5f47e904335e542336dc28173c0a4a65"} Mar 16 00:24:55 crc kubenswrapper[4816]: I0316 00:24:55.480176 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cql79" event={"ID":"a4cadddc-b411-48ca-b4d3-dc7fdf9767dd","Type":"ContainerDied","Data":"7206621fed299d699bb828068794b51424cc6881ba7958e6018750c2a55ad6a7"} Mar 16 00:24:55 crc kubenswrapper[4816]: I0316 00:24:55.480200 4816 scope.go:117] "RemoveContainer" containerID="3b262fd21eee0d19abdc5924d306ef1a5f47e904335e542336dc28173c0a4a65" Mar 16 00:24:55 crc kubenswrapper[4816]: I0316 00:24:55.495132 4816 scope.go:117] "RemoveContainer" containerID="90565ad1dfc64c2cda6b92e812af328722e858d91cfadac9340c716d273d76ff" Mar 16 00:24:55 crc kubenswrapper[4816]: I0316 00:24:55.508897 4816 scope.go:117] "RemoveContainer" containerID="9917112c1913ecdfd392c35ecd560b26665b5ee21c579ae017d9f0c4b26e303d" Mar 16 00:24:55 crc kubenswrapper[4816]: I0316 00:24:55.515273 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-cql79"] Mar 16 00:24:55 crc kubenswrapper[4816]: I0316 00:24:55.522671 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-cql79"] Mar 16 00:24:55 crc kubenswrapper[4816]: I0316 00:24:55.529712 4816 scope.go:117] "RemoveContainer" containerID="3b262fd21eee0d19abdc5924d306ef1a5f47e904335e542336dc28173c0a4a65" Mar 16 00:24:55 crc kubenswrapper[4816]: E0316 00:24:55.530247 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"3b262fd21eee0d19abdc5924d306ef1a5f47e904335e542336dc28173c0a4a65\": container with ID starting with 3b262fd21eee0d19abdc5924d306ef1a5f47e904335e542336dc28173c0a4a65 not found: ID does not exist" containerID="3b262fd21eee0d19abdc5924d306ef1a5f47e904335e542336dc28173c0a4a65" Mar 16 00:24:55 crc kubenswrapper[4816]: I0316 00:24:55.530279 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b262fd21eee0d19abdc5924d306ef1a5f47e904335e542336dc28173c0a4a65"} err="failed to get container status \"3b262fd21eee0d19abdc5924d306ef1a5f47e904335e542336dc28173c0a4a65\": rpc error: code = NotFound desc = could not find container \"3b262fd21eee0d19abdc5924d306ef1a5f47e904335e542336dc28173c0a4a65\": container with ID starting with 3b262fd21eee0d19abdc5924d306ef1a5f47e904335e542336dc28173c0a4a65 not found: ID does not exist" Mar 16 00:24:55 crc kubenswrapper[4816]: I0316 00:24:55.530300 4816 scope.go:117] "RemoveContainer" containerID="90565ad1dfc64c2cda6b92e812af328722e858d91cfadac9340c716d273d76ff" Mar 16 00:24:55 crc kubenswrapper[4816]: E0316 00:24:55.530723 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"90565ad1dfc64c2cda6b92e812af328722e858d91cfadac9340c716d273d76ff\": container with ID starting with 90565ad1dfc64c2cda6b92e812af328722e858d91cfadac9340c716d273d76ff not found: ID does not exist" containerID="90565ad1dfc64c2cda6b92e812af328722e858d91cfadac9340c716d273d76ff" Mar 16 00:24:55 crc kubenswrapper[4816]: I0316 00:24:55.530768 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90565ad1dfc64c2cda6b92e812af328722e858d91cfadac9340c716d273d76ff"} err="failed to get container status \"90565ad1dfc64c2cda6b92e812af328722e858d91cfadac9340c716d273d76ff\": rpc error: code = NotFound desc = could not find container \"90565ad1dfc64c2cda6b92e812af328722e858d91cfadac9340c716d273d76ff\": container with ID 
starting with 90565ad1dfc64c2cda6b92e812af328722e858d91cfadac9340c716d273d76ff not found: ID does not exist" Mar 16 00:24:55 crc kubenswrapper[4816]: I0316 00:24:55.530804 4816 scope.go:117] "RemoveContainer" containerID="9917112c1913ecdfd392c35ecd560b26665b5ee21c579ae017d9f0c4b26e303d" Mar 16 00:24:55 crc kubenswrapper[4816]: E0316 00:24:55.531098 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9917112c1913ecdfd392c35ecd560b26665b5ee21c579ae017d9f0c4b26e303d\": container with ID starting with 9917112c1913ecdfd392c35ecd560b26665b5ee21c579ae017d9f0c4b26e303d not found: ID does not exist" containerID="9917112c1913ecdfd392c35ecd560b26665b5ee21c579ae017d9f0c4b26e303d" Mar 16 00:24:55 crc kubenswrapper[4816]: I0316 00:24:55.531122 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9917112c1913ecdfd392c35ecd560b26665b5ee21c579ae017d9f0c4b26e303d"} err="failed to get container status \"9917112c1913ecdfd392c35ecd560b26665b5ee21c579ae017d9f0c4b26e303d\": rpc error: code = NotFound desc = could not find container \"9917112c1913ecdfd392c35ecd560b26665b5ee21c579ae017d9f0c4b26e303d\": container with ID starting with 9917112c1913ecdfd392c35ecd560b26665b5ee21c579ae017d9f0c4b26e303d not found: ID does not exist" Mar 16 00:24:55 crc kubenswrapper[4816]: I0316 00:24:55.673667 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4cadddc-b411-48ca-b4d3-dc7fdf9767dd" path="/var/lib/kubelet/pods/a4cadddc-b411-48ca-b4d3-dc7fdf9767dd/volumes" Mar 16 00:25:01 crc kubenswrapper[4816]: I0316 00:25:01.863001 4816 patch_prober.go:28] interesting pod/machine-config-daemon-jrdcz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 16 00:25:01 crc kubenswrapper[4816]: I0316 
00:25:01.863481 4816 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" podUID="dd08ece2-7636-4966-973a-e96a34b70b53" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 16 00:25:04 crc kubenswrapper[4816]: I0316 00:25:04.055773 4816 scope.go:117] "RemoveContainer" containerID="b862cec0bd3d63e5c9dfe4071f9f4f3cb758b083bc3f73a5460bc03b5c4debd8" Mar 16 00:25:07 crc kubenswrapper[4816]: I0316 00:25:07.569120 4816 generic.go:334] "Generic (PLEG): container finished" podID="a4629507-876a-405c-891c-5dcd521cf590" containerID="569da32a07b2521faaf5205b0d1082783f2685ea5a711b0eadcc5491dd41185a" exitCode=0 Mar 16 00:25:07 crc kubenswrapper[4816]: I0316 00:25:07.569254 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-2-build" event={"ID":"a4629507-876a-405c-891c-5dcd521cf590","Type":"ContainerDied","Data":"569da32a07b2521faaf5205b0d1082783f2685ea5a711b0eadcc5491dd41185a"} Mar 16 00:25:08 crc kubenswrapper[4816]: I0316 00:25:08.795418 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-2-build" Mar 16 00:25:08 crc kubenswrapper[4816]: I0316 00:25:08.815241 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/a4629507-876a-405c-891c-5dcd521cf590-build-system-configs\") pod \"a4629507-876a-405c-891c-5dcd521cf590\" (UID: \"a4629507-876a-405c-891c-5dcd521cf590\") " Mar 16 00:25:08 crc kubenswrapper[4816]: I0316 00:25:08.815302 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a4629507-876a-405c-891c-5dcd521cf590-build-ca-bundles\") pod \"a4629507-876a-405c-891c-5dcd521cf590\" (UID: \"a4629507-876a-405c-891c-5dcd521cf590\") " Mar 16 00:25:08 crc kubenswrapper[4816]: I0316 00:25:08.815361 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h7zgw\" (UniqueName: \"kubernetes.io/projected/a4629507-876a-405c-891c-5dcd521cf590-kube-api-access-h7zgw\") pod \"a4629507-876a-405c-891c-5dcd521cf590\" (UID: \"a4629507-876a-405c-891c-5dcd521cf590\") " Mar 16 00:25:08 crc kubenswrapper[4816]: I0316 00:25:08.815383 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/a4629507-876a-405c-891c-5dcd521cf590-buildworkdir\") pod \"a4629507-876a-405c-891c-5dcd521cf590\" (UID: \"a4629507-876a-405c-891c-5dcd521cf590\") " Mar 16 00:25:08 crc kubenswrapper[4816]: I0316 00:25:08.815401 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/a4629507-876a-405c-891c-5dcd521cf590-build-blob-cache\") pod \"a4629507-876a-405c-891c-5dcd521cf590\" (UID: \"a4629507-876a-405c-891c-5dcd521cf590\") " Mar 16 00:25:08 crc kubenswrapper[4816]: I0316 00:25:08.815427 4816 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"builder-dockercfg-fs5z5-push\" (UniqueName: \"kubernetes.io/secret/a4629507-876a-405c-891c-5dcd521cf590-builder-dockercfg-fs5z5-push\") pod \"a4629507-876a-405c-891c-5dcd521cf590\" (UID: \"a4629507-876a-405c-891c-5dcd521cf590\") " Mar 16 00:25:08 crc kubenswrapper[4816]: I0316 00:25:08.815446 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/a4629507-876a-405c-891c-5dcd521cf590-container-storage-run\") pod \"a4629507-876a-405c-891c-5dcd521cf590\" (UID: \"a4629507-876a-405c-891c-5dcd521cf590\") " Mar 16 00:25:08 crc kubenswrapper[4816]: I0316 00:25:08.815472 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/a4629507-876a-405c-891c-5dcd521cf590-node-pullsecrets\") pod \"a4629507-876a-405c-891c-5dcd521cf590\" (UID: \"a4629507-876a-405c-891c-5dcd521cf590\") " Mar 16 00:25:08 crc kubenswrapper[4816]: I0316 00:25:08.815493 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/a4629507-876a-405c-891c-5dcd521cf590-container-storage-root\") pod \"a4629507-876a-405c-891c-5dcd521cf590\" (UID: \"a4629507-876a-405c-891c-5dcd521cf590\") " Mar 16 00:25:08 crc kubenswrapper[4816]: I0316 00:25:08.815526 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-fs5z5-pull\" (UniqueName: \"kubernetes.io/secret/a4629507-876a-405c-891c-5dcd521cf590-builder-dockercfg-fs5z5-pull\") pod \"a4629507-876a-405c-891c-5dcd521cf590\" (UID: \"a4629507-876a-405c-891c-5dcd521cf590\") " Mar 16 00:25:08 crc kubenswrapper[4816]: I0316 00:25:08.815545 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: 
\"kubernetes.io/host-path/a4629507-876a-405c-891c-5dcd521cf590-buildcachedir\") pod \"a4629507-876a-405c-891c-5dcd521cf590\" (UID: \"a4629507-876a-405c-891c-5dcd521cf590\") " Mar 16 00:25:08 crc kubenswrapper[4816]: I0316 00:25:08.815594 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a4629507-876a-405c-891c-5dcd521cf590-build-proxy-ca-bundles\") pod \"a4629507-876a-405c-891c-5dcd521cf590\" (UID: \"a4629507-876a-405c-891c-5dcd521cf590\") " Mar 16 00:25:08 crc kubenswrapper[4816]: I0316 00:25:08.816451 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a4629507-876a-405c-891c-5dcd521cf590-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "a4629507-876a-405c-891c-5dcd521cf590" (UID: "a4629507-876a-405c-891c-5dcd521cf590"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:25:08 crc kubenswrapper[4816]: I0316 00:25:08.816496 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a4629507-876a-405c-891c-5dcd521cf590-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "a4629507-876a-405c-891c-5dcd521cf590" (UID: "a4629507-876a-405c-891c-5dcd521cf590"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:25:08 crc kubenswrapper[4816]: I0316 00:25:08.816572 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a4629507-876a-405c-891c-5dcd521cf590-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "a4629507-876a-405c-891c-5dcd521cf590" (UID: "a4629507-876a-405c-891c-5dcd521cf590"). InnerVolumeSpecName "buildcachedir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:25:08 crc kubenswrapper[4816]: I0316 00:25:08.816790 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a4629507-876a-405c-891c-5dcd521cf590-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "a4629507-876a-405c-891c-5dcd521cf590" (UID: "a4629507-876a-405c-891c-5dcd521cf590"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:25:08 crc kubenswrapper[4816]: I0316 00:25:08.817761 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a4629507-876a-405c-891c-5dcd521cf590-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "a4629507-876a-405c-891c-5dcd521cf590" (UID: "a4629507-876a-405c-891c-5dcd521cf590"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:25:08 crc kubenswrapper[4816]: I0316 00:25:08.819877 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a4629507-876a-405c-891c-5dcd521cf590-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "a4629507-876a-405c-891c-5dcd521cf590" (UID: "a4629507-876a-405c-891c-5dcd521cf590"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:25:08 crc kubenswrapper[4816]: I0316 00:25:08.822178 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4629507-876a-405c-891c-5dcd521cf590-kube-api-access-h7zgw" (OuterVolumeSpecName: "kube-api-access-h7zgw") pod "a4629507-876a-405c-891c-5dcd521cf590" (UID: "a4629507-876a-405c-891c-5dcd521cf590"). InnerVolumeSpecName "kube-api-access-h7zgw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:25:08 crc kubenswrapper[4816]: I0316 00:25:08.822439 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4629507-876a-405c-891c-5dcd521cf590-builder-dockercfg-fs5z5-push" (OuterVolumeSpecName: "builder-dockercfg-fs5z5-push") pod "a4629507-876a-405c-891c-5dcd521cf590" (UID: "a4629507-876a-405c-891c-5dcd521cf590"). InnerVolumeSpecName "builder-dockercfg-fs5z5-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:25:08 crc kubenswrapper[4816]: I0316 00:25:08.834956 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4629507-876a-405c-891c-5dcd521cf590-builder-dockercfg-fs5z5-pull" (OuterVolumeSpecName: "builder-dockercfg-fs5z5-pull") pod "a4629507-876a-405c-891c-5dcd521cf590" (UID: "a4629507-876a-405c-891c-5dcd521cf590"). InnerVolumeSpecName "builder-dockercfg-fs5z5-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:25:08 crc kubenswrapper[4816]: I0316 00:25:08.869524 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a4629507-876a-405c-891c-5dcd521cf590-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "a4629507-876a-405c-891c-5dcd521cf590" (UID: "a4629507-876a-405c-891c-5dcd521cf590"). InnerVolumeSpecName "buildworkdir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:25:08 crc kubenswrapper[4816]: I0316 00:25:08.916400 4816 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/a4629507-876a-405c-891c-5dcd521cf590-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 16 00:25:08 crc kubenswrapper[4816]: I0316 00:25:08.916437 4816 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a4629507-876a-405c-891c-5dcd521cf590-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 16 00:25:08 crc kubenswrapper[4816]: I0316 00:25:08.916446 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h7zgw\" (UniqueName: \"kubernetes.io/projected/a4629507-876a-405c-891c-5dcd521cf590-kube-api-access-h7zgw\") on node \"crc\" DevicePath \"\"" Mar 16 00:25:08 crc kubenswrapper[4816]: I0316 00:25:08.916454 4816 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/a4629507-876a-405c-891c-5dcd521cf590-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 16 00:25:08 crc kubenswrapper[4816]: I0316 00:25:08.916463 4816 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-fs5z5-push\" (UniqueName: \"kubernetes.io/secret/a4629507-876a-405c-891c-5dcd521cf590-builder-dockercfg-fs5z5-push\") on node \"crc\" DevicePath \"\"" Mar 16 00:25:08 crc kubenswrapper[4816]: I0316 00:25:08.916472 4816 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/a4629507-876a-405c-891c-5dcd521cf590-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 16 00:25:08 crc kubenswrapper[4816]: I0316 00:25:08.916479 4816 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/a4629507-876a-405c-891c-5dcd521cf590-node-pullsecrets\") on node \"crc\" DevicePath 
\"\"" Mar 16 00:25:08 crc kubenswrapper[4816]: I0316 00:25:08.916487 4816 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-fs5z5-pull\" (UniqueName: \"kubernetes.io/secret/a4629507-876a-405c-891c-5dcd521cf590-builder-dockercfg-fs5z5-pull\") on node \"crc\" DevicePath \"\"" Mar 16 00:25:08 crc kubenswrapper[4816]: I0316 00:25:08.916495 4816 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/a4629507-876a-405c-891c-5dcd521cf590-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 16 00:25:08 crc kubenswrapper[4816]: I0316 00:25:08.916523 4816 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a4629507-876a-405c-891c-5dcd521cf590-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 16 00:25:09 crc kubenswrapper[4816]: I0316 00:25:09.011117 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a4629507-876a-405c-891c-5dcd521cf590-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "a4629507-876a-405c-891c-5dcd521cf590" (UID: "a4629507-876a-405c-891c-5dcd521cf590"). InnerVolumeSpecName "build-blob-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:25:09 crc kubenswrapper[4816]: I0316 00:25:09.017608 4816 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/a4629507-876a-405c-891c-5dcd521cf590-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 16 00:25:09 crc kubenswrapper[4816]: I0316 00:25:09.585510 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-2-build" event={"ID":"a4629507-876a-405c-891c-5dcd521cf590","Type":"ContainerDied","Data":"afc880cd89cb3d0dd5c36035edaf726279cc4d27b43fc12a6df286ecc563c314"} Mar 16 00:25:09 crc kubenswrapper[4816]: I0316 00:25:09.585644 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="afc880cd89cb3d0dd5c36035edaf726279cc4d27b43fc12a6df286ecc563c314" Mar 16 00:25:09 crc kubenswrapper[4816]: I0316 00:25:09.585980 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-2-build" Mar 16 00:25:10 crc kubenswrapper[4816]: I0316 00:25:10.821833 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a4629507-876a-405c-891c-5dcd521cf590-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "a4629507-876a-405c-891c-5dcd521cf590" (UID: "a4629507-876a-405c-891c-5dcd521cf590"). InnerVolumeSpecName "container-storage-root". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:25:10 crc kubenswrapper[4816]: I0316 00:25:10.840239 4816 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/a4629507-876a-405c-891c-5dcd521cf590-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 16 00:25:13 crc kubenswrapper[4816]: I0316 00:25:13.935985 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/smart-gateway-operator-1-build"] Mar 16 00:25:13 crc kubenswrapper[4816]: E0316 00:25:13.936239 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4629507-876a-405c-891c-5dcd521cf590" containerName="manage-dockerfile" Mar 16 00:25:13 crc kubenswrapper[4816]: I0316 00:25:13.936250 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4629507-876a-405c-891c-5dcd521cf590" containerName="manage-dockerfile" Mar 16 00:25:13 crc kubenswrapper[4816]: E0316 00:25:13.936261 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4cadddc-b411-48ca-b4d3-dc7fdf9767dd" containerName="extract-utilities" Mar 16 00:25:13 crc kubenswrapper[4816]: I0316 00:25:13.936268 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4cadddc-b411-48ca-b4d3-dc7fdf9767dd" containerName="extract-utilities" Mar 16 00:25:13 crc kubenswrapper[4816]: E0316 00:25:13.936277 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4cadddc-b411-48ca-b4d3-dc7fdf9767dd" containerName="registry-server" Mar 16 00:25:13 crc kubenswrapper[4816]: I0316 00:25:13.936283 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4cadddc-b411-48ca-b4d3-dc7fdf9767dd" containerName="registry-server" Mar 16 00:25:13 crc kubenswrapper[4816]: E0316 00:25:13.936294 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4629507-876a-405c-891c-5dcd521cf590" containerName="docker-build" Mar 16 00:25:13 crc kubenswrapper[4816]: I0316 00:25:13.936299 4816 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="a4629507-876a-405c-891c-5dcd521cf590" containerName="docker-build" Mar 16 00:25:13 crc kubenswrapper[4816]: E0316 00:25:13.936314 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4cadddc-b411-48ca-b4d3-dc7fdf9767dd" containerName="extract-content" Mar 16 00:25:13 crc kubenswrapper[4816]: I0316 00:25:13.936320 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4cadddc-b411-48ca-b4d3-dc7fdf9767dd" containerName="extract-content" Mar 16 00:25:13 crc kubenswrapper[4816]: E0316 00:25:13.936329 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4629507-876a-405c-891c-5dcd521cf590" containerName="git-clone" Mar 16 00:25:13 crc kubenswrapper[4816]: I0316 00:25:13.936335 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4629507-876a-405c-891c-5dcd521cf590" containerName="git-clone" Mar 16 00:25:13 crc kubenswrapper[4816]: I0316 00:25:13.936443 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4629507-876a-405c-891c-5dcd521cf590" containerName="docker-build" Mar 16 00:25:13 crc kubenswrapper[4816]: I0316 00:25:13.936459 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4cadddc-b411-48ca-b4d3-dc7fdf9767dd" containerName="registry-server" Mar 16 00:25:13 crc kubenswrapper[4816]: I0316 00:25:13.937144 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/smart-gateway-operator-1-build" Mar 16 00:25:13 crc kubenswrapper[4816]: I0316 00:25:13.941391 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-1-sys-config" Mar 16 00:25:13 crc kubenswrapper[4816]: I0316 00:25:13.941463 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-1-ca" Mar 16 00:25:13 crc kubenswrapper[4816]: I0316 00:25:13.941630 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-1-global-ca" Mar 16 00:25:13 crc kubenswrapper[4816]: I0316 00:25:13.941738 4816 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-fs5z5" Mar 16 00:25:13 crc kubenswrapper[4816]: I0316 00:25:13.949764 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-1-build"] Mar 16 00:25:14 crc kubenswrapper[4816]: I0316 00:25:14.086213 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/56c73079-20fb-4653-955c-7c540b94c96d-build-ca-bundles\") pod \"smart-gateway-operator-1-build\" (UID: \"56c73079-20fb-4653-955c-7c540b94c96d\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 16 00:25:14 crc kubenswrapper[4816]: I0316 00:25:14.086526 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/56c73079-20fb-4653-955c-7c540b94c96d-container-storage-run\") pod \"smart-gateway-operator-1-build\" (UID: \"56c73079-20fb-4653-955c-7c540b94c96d\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 16 00:25:14 crc kubenswrapper[4816]: I0316 00:25:14.086573 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"builder-dockercfg-fs5z5-pull\" (UniqueName: \"kubernetes.io/secret/56c73079-20fb-4653-955c-7c540b94c96d-builder-dockercfg-fs5z5-pull\") pod \"smart-gateway-operator-1-build\" (UID: \"56c73079-20fb-4653-955c-7c540b94c96d\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 16 00:25:14 crc kubenswrapper[4816]: I0316 00:25:14.086605 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/56c73079-20fb-4653-955c-7c540b94c96d-build-blob-cache\") pod \"smart-gateway-operator-1-build\" (UID: \"56c73079-20fb-4653-955c-7c540b94c96d\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 16 00:25:14 crc kubenswrapper[4816]: I0316 00:25:14.086629 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-fs5z5-push\" (UniqueName: \"kubernetes.io/secret/56c73079-20fb-4653-955c-7c540b94c96d-builder-dockercfg-fs5z5-push\") pod \"smart-gateway-operator-1-build\" (UID: \"56c73079-20fb-4653-955c-7c540b94c96d\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 16 00:25:14 crc kubenswrapper[4816]: I0316 00:25:14.086660 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/56c73079-20fb-4653-955c-7c540b94c96d-build-proxy-ca-bundles\") pod \"smart-gateway-operator-1-build\" (UID: \"56c73079-20fb-4653-955c-7c540b94c96d\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 16 00:25:14 crc kubenswrapper[4816]: I0316 00:25:14.086745 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/56c73079-20fb-4653-955c-7c540b94c96d-build-system-configs\") pod \"smart-gateway-operator-1-build\" (UID: \"56c73079-20fb-4653-955c-7c540b94c96d\") " 
pod="service-telemetry/smart-gateway-operator-1-build" Mar 16 00:25:14 crc kubenswrapper[4816]: I0316 00:25:14.086769 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/56c73079-20fb-4653-955c-7c540b94c96d-buildcachedir\") pod \"smart-gateway-operator-1-build\" (UID: \"56c73079-20fb-4653-955c-7c540b94c96d\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 16 00:25:14 crc kubenswrapper[4816]: I0316 00:25:14.086799 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/56c73079-20fb-4653-955c-7c540b94c96d-buildworkdir\") pod \"smart-gateway-operator-1-build\" (UID: \"56c73079-20fb-4653-955c-7c540b94c96d\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 16 00:25:14 crc kubenswrapper[4816]: I0316 00:25:14.086829 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/56c73079-20fb-4653-955c-7c540b94c96d-node-pullsecrets\") pod \"smart-gateway-operator-1-build\" (UID: \"56c73079-20fb-4653-955c-7c540b94c96d\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 16 00:25:14 crc kubenswrapper[4816]: I0316 00:25:14.086851 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/56c73079-20fb-4653-955c-7c540b94c96d-container-storage-root\") pod \"smart-gateway-operator-1-build\" (UID: \"56c73079-20fb-4653-955c-7c540b94c96d\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 16 00:25:14 crc kubenswrapper[4816]: I0316 00:25:14.086889 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xr8b4\" (UniqueName: 
\"kubernetes.io/projected/56c73079-20fb-4653-955c-7c540b94c96d-kube-api-access-xr8b4\") pod \"smart-gateway-operator-1-build\" (UID: \"56c73079-20fb-4653-955c-7c540b94c96d\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 16 00:25:14 crc kubenswrapper[4816]: I0316 00:25:14.188272 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/56c73079-20fb-4653-955c-7c540b94c96d-build-blob-cache\") pod \"smart-gateway-operator-1-build\" (UID: \"56c73079-20fb-4653-955c-7c540b94c96d\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 16 00:25:14 crc kubenswrapper[4816]: I0316 00:25:14.188367 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-fs5z5-push\" (UniqueName: \"kubernetes.io/secret/56c73079-20fb-4653-955c-7c540b94c96d-builder-dockercfg-fs5z5-push\") pod \"smart-gateway-operator-1-build\" (UID: \"56c73079-20fb-4653-955c-7c540b94c96d\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 16 00:25:14 crc kubenswrapper[4816]: I0316 00:25:14.188456 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/56c73079-20fb-4653-955c-7c540b94c96d-build-proxy-ca-bundles\") pod \"smart-gateway-operator-1-build\" (UID: \"56c73079-20fb-4653-955c-7c540b94c96d\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 16 00:25:14 crc kubenswrapper[4816]: I0316 00:25:14.188518 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/56c73079-20fb-4653-955c-7c540b94c96d-build-system-configs\") pod \"smart-gateway-operator-1-build\" (UID: \"56c73079-20fb-4653-955c-7c540b94c96d\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 16 00:25:14 crc kubenswrapper[4816]: I0316 00:25:14.188603 4816 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/56c73079-20fb-4653-955c-7c540b94c96d-buildcachedir\") pod \"smart-gateway-operator-1-build\" (UID: \"56c73079-20fb-4653-955c-7c540b94c96d\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 16 00:25:14 crc kubenswrapper[4816]: I0316 00:25:14.188662 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/56c73079-20fb-4653-955c-7c540b94c96d-buildworkdir\") pod \"smart-gateway-operator-1-build\" (UID: \"56c73079-20fb-4653-955c-7c540b94c96d\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 16 00:25:14 crc kubenswrapper[4816]: I0316 00:25:14.188686 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/56c73079-20fb-4653-955c-7c540b94c96d-build-blob-cache\") pod \"smart-gateway-operator-1-build\" (UID: \"56c73079-20fb-4653-955c-7c540b94c96d\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 16 00:25:14 crc kubenswrapper[4816]: I0316 00:25:14.188725 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/56c73079-20fb-4653-955c-7c540b94c96d-node-pullsecrets\") pod \"smart-gateway-operator-1-build\" (UID: \"56c73079-20fb-4653-955c-7c540b94c96d\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 16 00:25:14 crc kubenswrapper[4816]: I0316 00:25:14.188774 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/56c73079-20fb-4653-955c-7c540b94c96d-container-storage-root\") pod \"smart-gateway-operator-1-build\" (UID: \"56c73079-20fb-4653-955c-7c540b94c96d\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 16 00:25:14 crc kubenswrapper[4816]: I0316 00:25:14.188870 4816 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xr8b4\" (UniqueName: \"kubernetes.io/projected/56c73079-20fb-4653-955c-7c540b94c96d-kube-api-access-xr8b4\") pod \"smart-gateway-operator-1-build\" (UID: \"56c73079-20fb-4653-955c-7c540b94c96d\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 16 00:25:14 crc kubenswrapper[4816]: I0316 00:25:14.188940 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/56c73079-20fb-4653-955c-7c540b94c96d-build-ca-bundles\") pod \"smart-gateway-operator-1-build\" (UID: \"56c73079-20fb-4653-955c-7c540b94c96d\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 16 00:25:14 crc kubenswrapper[4816]: I0316 00:25:14.188987 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/56c73079-20fb-4653-955c-7c540b94c96d-container-storage-run\") pod \"smart-gateway-operator-1-build\" (UID: \"56c73079-20fb-4653-955c-7c540b94c96d\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 16 00:25:14 crc kubenswrapper[4816]: I0316 00:25:14.189031 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-fs5z5-pull\" (UniqueName: \"kubernetes.io/secret/56c73079-20fb-4653-955c-7c540b94c96d-builder-dockercfg-fs5z5-pull\") pod \"smart-gateway-operator-1-build\" (UID: \"56c73079-20fb-4653-955c-7c540b94c96d\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 16 00:25:14 crc kubenswrapper[4816]: I0316 00:25:14.189223 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/56c73079-20fb-4653-955c-7c540b94c96d-build-system-configs\") pod \"smart-gateway-operator-1-build\" (UID: \"56c73079-20fb-4653-955c-7c540b94c96d\") " pod="service-telemetry/smart-gateway-operator-1-build" 
Mar 16 00:25:14 crc kubenswrapper[4816]: I0316 00:25:14.190700 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/56c73079-20fb-4653-955c-7c540b94c96d-buildcachedir\") pod \"smart-gateway-operator-1-build\" (UID: \"56c73079-20fb-4653-955c-7c540b94c96d\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 16 00:25:14 crc kubenswrapper[4816]: I0316 00:25:14.190844 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/56c73079-20fb-4653-955c-7c540b94c96d-container-storage-root\") pod \"smart-gateway-operator-1-build\" (UID: \"56c73079-20fb-4653-955c-7c540b94c96d\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 16 00:25:14 crc kubenswrapper[4816]: I0316 00:25:14.190944 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/56c73079-20fb-4653-955c-7c540b94c96d-node-pullsecrets\") pod \"smart-gateway-operator-1-build\" (UID: \"56c73079-20fb-4653-955c-7c540b94c96d\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 16 00:25:14 crc kubenswrapper[4816]: I0316 00:25:14.191068 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/56c73079-20fb-4653-955c-7c540b94c96d-build-proxy-ca-bundles\") pod \"smart-gateway-operator-1-build\" (UID: \"56c73079-20fb-4653-955c-7c540b94c96d\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 16 00:25:14 crc kubenswrapper[4816]: I0316 00:25:14.191065 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/56c73079-20fb-4653-955c-7c540b94c96d-container-storage-run\") pod \"smart-gateway-operator-1-build\" (UID: \"56c73079-20fb-4653-955c-7c540b94c96d\") " 
pod="service-telemetry/smart-gateway-operator-1-build" Mar 16 00:25:14 crc kubenswrapper[4816]: I0316 00:25:14.191443 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/56c73079-20fb-4653-955c-7c540b94c96d-buildworkdir\") pod \"smart-gateway-operator-1-build\" (UID: \"56c73079-20fb-4653-955c-7c540b94c96d\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 16 00:25:14 crc kubenswrapper[4816]: I0316 00:25:14.192194 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/56c73079-20fb-4653-955c-7c540b94c96d-build-ca-bundles\") pod \"smart-gateway-operator-1-build\" (UID: \"56c73079-20fb-4653-955c-7c540b94c96d\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 16 00:25:14 crc kubenswrapper[4816]: I0316 00:25:14.196070 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-fs5z5-push\" (UniqueName: \"kubernetes.io/secret/56c73079-20fb-4653-955c-7c540b94c96d-builder-dockercfg-fs5z5-push\") pod \"smart-gateway-operator-1-build\" (UID: \"56c73079-20fb-4653-955c-7c540b94c96d\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 16 00:25:14 crc kubenswrapper[4816]: I0316 00:25:14.196717 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-fs5z5-pull\" (UniqueName: \"kubernetes.io/secret/56c73079-20fb-4653-955c-7c540b94c96d-builder-dockercfg-fs5z5-pull\") pod \"smart-gateway-operator-1-build\" (UID: \"56c73079-20fb-4653-955c-7c540b94c96d\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 16 00:25:14 crc kubenswrapper[4816]: I0316 00:25:14.231708 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xr8b4\" (UniqueName: \"kubernetes.io/projected/56c73079-20fb-4653-955c-7c540b94c96d-kube-api-access-xr8b4\") pod \"smart-gateway-operator-1-build\" (UID: 
\"56c73079-20fb-4653-955c-7c540b94c96d\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 16 00:25:14 crc kubenswrapper[4816]: I0316 00:25:14.266699 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-1-build" Mar 16 00:25:14 crc kubenswrapper[4816]: I0316 00:25:14.727145 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-1-build"] Mar 16 00:25:15 crc kubenswrapper[4816]: I0316 00:25:15.638765 4816 generic.go:334] "Generic (PLEG): container finished" podID="56c73079-20fb-4653-955c-7c540b94c96d" containerID="8f627bd43002403079a8ab332e67046f4dcc47cb1e30ec5833b02e52d4b44695" exitCode=0 Mar 16 00:25:15 crc kubenswrapper[4816]: I0316 00:25:15.638885 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-1-build" event={"ID":"56c73079-20fb-4653-955c-7c540b94c96d","Type":"ContainerDied","Data":"8f627bd43002403079a8ab332e67046f4dcc47cb1e30ec5833b02e52d4b44695"} Mar 16 00:25:15 crc kubenswrapper[4816]: I0316 00:25:15.639747 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-1-build" event={"ID":"56c73079-20fb-4653-955c-7c540b94c96d","Type":"ContainerStarted","Data":"a5dfa12c2bb2e35ed6fbc301ba380573c9717ad79ecb931f8c64e95920bec037"} Mar 16 00:25:16 crc kubenswrapper[4816]: I0316 00:25:16.676928 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-1-build" event={"ID":"56c73079-20fb-4653-955c-7c540b94c96d","Type":"ContainerStarted","Data":"246df89bdda73900d247502bdf47a55f32fe712e153ba8fcaa5b38620e4cc5b5"} Mar 16 00:25:16 crc kubenswrapper[4816]: I0316 00:25:16.719756 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/smart-gateway-operator-1-build" podStartSLOduration=3.7197272679999998 podStartE2EDuration="3.719727268s" podCreationTimestamp="2026-03-16 
00:25:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:25:16.710184236 +0000 UTC m=+1109.806484259" watchObservedRunningTime="2026-03-16 00:25:16.719727268 +0000 UTC m=+1109.816027261" Mar 16 00:25:24 crc kubenswrapper[4816]: I0316 00:25:24.776934 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/smart-gateway-operator-1-build"] Mar 16 00:25:24 crc kubenswrapper[4816]: I0316 00:25:24.777831 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/smart-gateway-operator-1-build" podUID="56c73079-20fb-4653-955c-7c540b94c96d" containerName="docker-build" containerID="cri-o://246df89bdda73900d247502bdf47a55f32fe712e153ba8fcaa5b38620e4cc5b5" gracePeriod=30 Mar 16 00:25:25 crc kubenswrapper[4816]: I0316 00:25:25.154815 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-1-build_56c73079-20fb-4653-955c-7c540b94c96d/docker-build/0.log" Mar 16 00:25:25 crc kubenswrapper[4816]: I0316 00:25:25.155434 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/smart-gateway-operator-1-build" Mar 16 00:25:25 crc kubenswrapper[4816]: I0316 00:25:25.342489 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/56c73079-20fb-4653-955c-7c540b94c96d-build-proxy-ca-bundles\") pod \"56c73079-20fb-4653-955c-7c540b94c96d\" (UID: \"56c73079-20fb-4653-955c-7c540b94c96d\") " Mar 16 00:25:25 crc kubenswrapper[4816]: I0316 00:25:25.342567 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/56c73079-20fb-4653-955c-7c540b94c96d-build-ca-bundles\") pod \"56c73079-20fb-4653-955c-7c540b94c96d\" (UID: \"56c73079-20fb-4653-955c-7c540b94c96d\") " Mar 16 00:25:25 crc kubenswrapper[4816]: I0316 00:25:25.342611 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-fs5z5-push\" (UniqueName: \"kubernetes.io/secret/56c73079-20fb-4653-955c-7c540b94c96d-builder-dockercfg-fs5z5-push\") pod \"56c73079-20fb-4653-955c-7c540b94c96d\" (UID: \"56c73079-20fb-4653-955c-7c540b94c96d\") " Mar 16 00:25:25 crc kubenswrapper[4816]: I0316 00:25:25.342662 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-fs5z5-pull\" (UniqueName: \"kubernetes.io/secret/56c73079-20fb-4653-955c-7c540b94c96d-builder-dockercfg-fs5z5-pull\") pod \"56c73079-20fb-4653-955c-7c540b94c96d\" (UID: \"56c73079-20fb-4653-955c-7c540b94c96d\") " Mar 16 00:25:25 crc kubenswrapper[4816]: I0316 00:25:25.342697 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/56c73079-20fb-4653-955c-7c540b94c96d-buildworkdir\") pod \"56c73079-20fb-4653-955c-7c540b94c96d\" (UID: \"56c73079-20fb-4653-955c-7c540b94c96d\") " Mar 16 00:25:25 crc kubenswrapper[4816]: I0316 00:25:25.344033 4816 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/56c73079-20fb-4653-955c-7c540b94c96d-container-storage-run\") pod \"56c73079-20fb-4653-955c-7c540b94c96d\" (UID: \"56c73079-20fb-4653-955c-7c540b94c96d\") " Mar 16 00:25:25 crc kubenswrapper[4816]: I0316 00:25:25.344102 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/56c73079-20fb-4653-955c-7c540b94c96d-buildcachedir\") pod \"56c73079-20fb-4653-955c-7c540b94c96d\" (UID: \"56c73079-20fb-4653-955c-7c540b94c96d\") " Mar 16 00:25:25 crc kubenswrapper[4816]: I0316 00:25:25.343409 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56c73079-20fb-4653-955c-7c540b94c96d-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "56c73079-20fb-4653-955c-7c540b94c96d" (UID: "56c73079-20fb-4653-955c-7c540b94c96d"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:25:25 crc kubenswrapper[4816]: I0316 00:25:25.344132 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/56c73079-20fb-4653-955c-7c540b94c96d-node-pullsecrets\") pod \"56c73079-20fb-4653-955c-7c540b94c96d\" (UID: \"56c73079-20fb-4653-955c-7c540b94c96d\") " Mar 16 00:25:25 crc kubenswrapper[4816]: I0316 00:25:25.343718 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56c73079-20fb-4653-955c-7c540b94c96d-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "56c73079-20fb-4653-955c-7c540b94c96d" (UID: "56c73079-20fb-4653-955c-7c540b94c96d"). InnerVolumeSpecName "build-proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:25:25 crc kubenswrapper[4816]: I0316 00:25:25.343797 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56c73079-20fb-4653-955c-7c540b94c96d-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "56c73079-20fb-4653-955c-7c540b94c96d" (UID: "56c73079-20fb-4653-955c-7c540b94c96d"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:25:25 crc kubenswrapper[4816]: I0316 00:25:25.344179 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/56c73079-20fb-4653-955c-7c540b94c96d-build-blob-cache\") pod \"56c73079-20fb-4653-955c-7c540b94c96d\" (UID: \"56c73079-20fb-4653-955c-7c540b94c96d\") " Mar 16 00:25:25 crc kubenswrapper[4816]: I0316 00:25:25.344184 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/56c73079-20fb-4653-955c-7c540b94c96d-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "56c73079-20fb-4653-955c-7c540b94c96d" (UID: "56c73079-20fb-4653-955c-7c540b94c96d"). InnerVolumeSpecName "buildcachedir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:25:25 crc kubenswrapper[4816]: I0316 00:25:25.344216 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/56c73079-20fb-4653-955c-7c540b94c96d-build-system-configs\") pod \"56c73079-20fb-4653-955c-7c540b94c96d\" (UID: \"56c73079-20fb-4653-955c-7c540b94c96d\") " Mar 16 00:25:25 crc kubenswrapper[4816]: I0316 00:25:25.344216 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/56c73079-20fb-4653-955c-7c540b94c96d-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "56c73079-20fb-4653-955c-7c540b94c96d" (UID: "56c73079-20fb-4653-955c-7c540b94c96d"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:25:25 crc kubenswrapper[4816]: I0316 00:25:25.344357 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xr8b4\" (UniqueName: \"kubernetes.io/projected/56c73079-20fb-4653-955c-7c540b94c96d-kube-api-access-xr8b4\") pod \"56c73079-20fb-4653-955c-7c540b94c96d\" (UID: \"56c73079-20fb-4653-955c-7c540b94c96d\") " Mar 16 00:25:25 crc kubenswrapper[4816]: I0316 00:25:25.344509 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56c73079-20fb-4653-955c-7c540b94c96d-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "56c73079-20fb-4653-955c-7c540b94c96d" (UID: "56c73079-20fb-4653-955c-7c540b94c96d"). InnerVolumeSpecName "build-system-configs". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:25:25 crc kubenswrapper[4816]: I0316 00:25:25.345005 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/56c73079-20fb-4653-955c-7c540b94c96d-container-storage-root\") pod \"56c73079-20fb-4653-955c-7c540b94c96d\" (UID: \"56c73079-20fb-4653-955c-7c540b94c96d\") " Mar 16 00:25:25 crc kubenswrapper[4816]: I0316 00:25:25.345017 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56c73079-20fb-4653-955c-7c540b94c96d-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "56c73079-20fb-4653-955c-7c540b94c96d" (UID: "56c73079-20fb-4653-955c-7c540b94c96d"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:25:25 crc kubenswrapper[4816]: I0316 00:25:25.345373 4816 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/56c73079-20fb-4653-955c-7c540b94c96d-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 16 00:25:25 crc kubenswrapper[4816]: I0316 00:25:25.345434 4816 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/56c73079-20fb-4653-955c-7c540b94c96d-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 16 00:25:25 crc kubenswrapper[4816]: I0316 00:25:25.345447 4816 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/56c73079-20fb-4653-955c-7c540b94c96d-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 16 00:25:25 crc kubenswrapper[4816]: I0316 00:25:25.345460 4816 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/56c73079-20fb-4653-955c-7c540b94c96d-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 16 
00:25:25 crc kubenswrapper[4816]: I0316 00:25:25.345472 4816 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/56c73079-20fb-4653-955c-7c540b94c96d-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 16 00:25:25 crc kubenswrapper[4816]: I0316 00:25:25.345483 4816 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/56c73079-20fb-4653-955c-7c540b94c96d-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 16 00:25:25 crc kubenswrapper[4816]: I0316 00:25:25.345494 4816 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/56c73079-20fb-4653-955c-7c540b94c96d-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 16 00:25:25 crc kubenswrapper[4816]: I0316 00:25:25.349684 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56c73079-20fb-4653-955c-7c540b94c96d-builder-dockercfg-fs5z5-push" (OuterVolumeSpecName: "builder-dockercfg-fs5z5-push") pod "56c73079-20fb-4653-955c-7c540b94c96d" (UID: "56c73079-20fb-4653-955c-7c540b94c96d"). InnerVolumeSpecName "builder-dockercfg-fs5z5-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:25:25 crc kubenswrapper[4816]: I0316 00:25:25.350339 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56c73079-20fb-4653-955c-7c540b94c96d-kube-api-access-xr8b4" (OuterVolumeSpecName: "kube-api-access-xr8b4") pod "56c73079-20fb-4653-955c-7c540b94c96d" (UID: "56c73079-20fb-4653-955c-7c540b94c96d"). InnerVolumeSpecName "kube-api-access-xr8b4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:25:25 crc kubenswrapper[4816]: I0316 00:25:25.350472 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56c73079-20fb-4653-955c-7c540b94c96d-builder-dockercfg-fs5z5-pull" (OuterVolumeSpecName: "builder-dockercfg-fs5z5-pull") pod "56c73079-20fb-4653-955c-7c540b94c96d" (UID: "56c73079-20fb-4653-955c-7c540b94c96d"). InnerVolumeSpecName "builder-dockercfg-fs5z5-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:25:25 crc kubenswrapper[4816]: I0316 00:25:25.447400 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xr8b4\" (UniqueName: \"kubernetes.io/projected/56c73079-20fb-4653-955c-7c540b94c96d-kube-api-access-xr8b4\") on node \"crc\" DevicePath \"\"" Mar 16 00:25:25 crc kubenswrapper[4816]: I0316 00:25:25.447442 4816 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-fs5z5-push\" (UniqueName: \"kubernetes.io/secret/56c73079-20fb-4653-955c-7c540b94c96d-builder-dockercfg-fs5z5-push\") on node \"crc\" DevicePath \"\"" Mar 16 00:25:25 crc kubenswrapper[4816]: I0316 00:25:25.447454 4816 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-fs5z5-pull\" (UniqueName: \"kubernetes.io/secret/56c73079-20fb-4653-955c-7c540b94c96d-builder-dockercfg-fs5z5-pull\") on node \"crc\" DevicePath \"\"" Mar 16 00:25:25 crc kubenswrapper[4816]: I0316 00:25:25.519499 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56c73079-20fb-4653-955c-7c540b94c96d-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "56c73079-20fb-4653-955c-7c540b94c96d" (UID: "56c73079-20fb-4653-955c-7c540b94c96d"). InnerVolumeSpecName "build-blob-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:25:25 crc kubenswrapper[4816]: I0316 00:25:25.548389 4816 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/56c73079-20fb-4653-955c-7c540b94c96d-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 16 00:25:25 crc kubenswrapper[4816]: I0316 00:25:25.738464 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-1-build_56c73079-20fb-4653-955c-7c540b94c96d/docker-build/0.log" Mar 16 00:25:25 crc kubenswrapper[4816]: I0316 00:25:25.739326 4816 generic.go:334] "Generic (PLEG): container finished" podID="56c73079-20fb-4653-955c-7c540b94c96d" containerID="246df89bdda73900d247502bdf47a55f32fe712e153ba8fcaa5b38620e4cc5b5" exitCode=1 Mar 16 00:25:25 crc kubenswrapper[4816]: I0316 00:25:25.739368 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-1-build" event={"ID":"56c73079-20fb-4653-955c-7c540b94c96d","Type":"ContainerDied","Data":"246df89bdda73900d247502bdf47a55f32fe712e153ba8fcaa5b38620e4cc5b5"} Mar 16 00:25:25 crc kubenswrapper[4816]: I0316 00:25:25.739401 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-1-build" event={"ID":"56c73079-20fb-4653-955c-7c540b94c96d","Type":"ContainerDied","Data":"a5dfa12c2bb2e35ed6fbc301ba380573c9717ad79ecb931f8c64e95920bec037"} Mar 16 00:25:25 crc kubenswrapper[4816]: I0316 00:25:25.739423 4816 scope.go:117] "RemoveContainer" containerID="246df89bdda73900d247502bdf47a55f32fe712e153ba8fcaa5b38620e4cc5b5" Mar 16 00:25:25 crc kubenswrapper[4816]: I0316 00:25:25.739805 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/smart-gateway-operator-1-build" Mar 16 00:25:25 crc kubenswrapper[4816]: I0316 00:25:25.805910 4816 scope.go:117] "RemoveContainer" containerID="8f627bd43002403079a8ab332e67046f4dcc47cb1e30ec5833b02e52d4b44695" Mar 16 00:25:25 crc kubenswrapper[4816]: I0316 00:25:25.830089 4816 scope.go:117] "RemoveContainer" containerID="246df89bdda73900d247502bdf47a55f32fe712e153ba8fcaa5b38620e4cc5b5" Mar 16 00:25:25 crc kubenswrapper[4816]: E0316 00:25:25.830907 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"246df89bdda73900d247502bdf47a55f32fe712e153ba8fcaa5b38620e4cc5b5\": container with ID starting with 246df89bdda73900d247502bdf47a55f32fe712e153ba8fcaa5b38620e4cc5b5 not found: ID does not exist" containerID="246df89bdda73900d247502bdf47a55f32fe712e153ba8fcaa5b38620e4cc5b5" Mar 16 00:25:25 crc kubenswrapper[4816]: I0316 00:25:25.830943 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"246df89bdda73900d247502bdf47a55f32fe712e153ba8fcaa5b38620e4cc5b5"} err="failed to get container status \"246df89bdda73900d247502bdf47a55f32fe712e153ba8fcaa5b38620e4cc5b5\": rpc error: code = NotFound desc = could not find container \"246df89bdda73900d247502bdf47a55f32fe712e153ba8fcaa5b38620e4cc5b5\": container with ID starting with 246df89bdda73900d247502bdf47a55f32fe712e153ba8fcaa5b38620e4cc5b5 not found: ID does not exist" Mar 16 00:25:25 crc kubenswrapper[4816]: I0316 00:25:25.830994 4816 scope.go:117] "RemoveContainer" containerID="8f627bd43002403079a8ab332e67046f4dcc47cb1e30ec5833b02e52d4b44695" Mar 16 00:25:25 crc kubenswrapper[4816]: E0316 00:25:25.832045 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f627bd43002403079a8ab332e67046f4dcc47cb1e30ec5833b02e52d4b44695\": container with ID starting with 
8f627bd43002403079a8ab332e67046f4dcc47cb1e30ec5833b02e52d4b44695 not found: ID does not exist" containerID="8f627bd43002403079a8ab332e67046f4dcc47cb1e30ec5833b02e52d4b44695" Mar 16 00:25:25 crc kubenswrapper[4816]: I0316 00:25:25.832078 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f627bd43002403079a8ab332e67046f4dcc47cb1e30ec5833b02e52d4b44695"} err="failed to get container status \"8f627bd43002403079a8ab332e67046f4dcc47cb1e30ec5833b02e52d4b44695\": rpc error: code = NotFound desc = could not find container \"8f627bd43002403079a8ab332e67046f4dcc47cb1e30ec5833b02e52d4b44695\": container with ID starting with 8f627bd43002403079a8ab332e67046f4dcc47cb1e30ec5833b02e52d4b44695 not found: ID does not exist" Mar 16 00:25:25 crc kubenswrapper[4816]: I0316 00:25:25.886883 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56c73079-20fb-4653-955c-7c540b94c96d-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "56c73079-20fb-4653-955c-7c540b94c96d" (UID: "56c73079-20fb-4653-955c-7c540b94c96d"). InnerVolumeSpecName "container-storage-root". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:25:25 crc kubenswrapper[4816]: I0316 00:25:25.953079 4816 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/56c73079-20fb-4653-955c-7c540b94c96d-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 16 00:25:26 crc kubenswrapper[4816]: I0316 00:25:26.071415 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/smart-gateway-operator-1-build"] Mar 16 00:25:26 crc kubenswrapper[4816]: I0316 00:25:26.077783 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/smart-gateway-operator-1-build"] Mar 16 00:25:26 crc kubenswrapper[4816]: I0316 00:25:26.441855 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/smart-gateway-operator-2-build"] Mar 16 00:25:26 crc kubenswrapper[4816]: E0316 00:25:26.442165 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56c73079-20fb-4653-955c-7c540b94c96d" containerName="docker-build" Mar 16 00:25:26 crc kubenswrapper[4816]: I0316 00:25:26.442177 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="56c73079-20fb-4653-955c-7c540b94c96d" containerName="docker-build" Mar 16 00:25:26 crc kubenswrapper[4816]: E0316 00:25:26.442196 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56c73079-20fb-4653-955c-7c540b94c96d" containerName="manage-dockerfile" Mar 16 00:25:26 crc kubenswrapper[4816]: I0316 00:25:26.442203 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="56c73079-20fb-4653-955c-7c540b94c96d" containerName="manage-dockerfile" Mar 16 00:25:26 crc kubenswrapper[4816]: I0316 00:25:26.442514 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="56c73079-20fb-4653-955c-7c540b94c96d" containerName="docker-build" Mar 16 00:25:26 crc kubenswrapper[4816]: I0316 00:25:26.443399 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/smart-gateway-operator-2-build" Mar 16 00:25:26 crc kubenswrapper[4816]: I0316 00:25:26.445821 4816 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-fs5z5" Mar 16 00:25:26 crc kubenswrapper[4816]: I0316 00:25:26.446081 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-2-sys-config" Mar 16 00:25:26 crc kubenswrapper[4816]: I0316 00:25:26.446781 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-2-global-ca" Mar 16 00:25:26 crc kubenswrapper[4816]: I0316 00:25:26.446913 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-2-ca" Mar 16 00:25:26 crc kubenswrapper[4816]: I0316 00:25:26.458138 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d8a56e88-900c-411c-b75c-029cf7bee318-build-proxy-ca-bundles\") pod \"smart-gateway-operator-2-build\" (UID: \"d8a56e88-900c-411c-b75c-029cf7bee318\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 16 00:25:26 crc kubenswrapper[4816]: I0316 00:25:26.458187 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/d8a56e88-900c-411c-b75c-029cf7bee318-container-storage-root\") pod \"smart-gateway-operator-2-build\" (UID: \"d8a56e88-900c-411c-b75c-029cf7bee318\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 16 00:25:26 crc kubenswrapper[4816]: I0316 00:25:26.458258 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-fs5z5-pull\" (UniqueName: \"kubernetes.io/secret/d8a56e88-900c-411c-b75c-029cf7bee318-builder-dockercfg-fs5z5-pull\") pod 
\"smart-gateway-operator-2-build\" (UID: \"d8a56e88-900c-411c-b75c-029cf7bee318\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 16 00:25:26 crc kubenswrapper[4816]: I0316 00:25:26.458292 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-fs5z5-push\" (UniqueName: \"kubernetes.io/secret/d8a56e88-900c-411c-b75c-029cf7bee318-builder-dockercfg-fs5z5-push\") pod \"smart-gateway-operator-2-build\" (UID: \"d8a56e88-900c-411c-b75c-029cf7bee318\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 16 00:25:26 crc kubenswrapper[4816]: I0316 00:25:26.458311 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/d8a56e88-900c-411c-b75c-029cf7bee318-build-system-configs\") pod \"smart-gateway-operator-2-build\" (UID: \"d8a56e88-900c-411c-b75c-029cf7bee318\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 16 00:25:26 crc kubenswrapper[4816]: I0316 00:25:26.458333 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d8a56e88-900c-411c-b75c-029cf7bee318-build-ca-bundles\") pod \"smart-gateway-operator-2-build\" (UID: \"d8a56e88-900c-411c-b75c-029cf7bee318\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 16 00:25:26 crc kubenswrapper[4816]: I0316 00:25:26.458355 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/d8a56e88-900c-411c-b75c-029cf7bee318-container-storage-run\") pod \"smart-gateway-operator-2-build\" (UID: \"d8a56e88-900c-411c-b75c-029cf7bee318\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 16 00:25:26 crc kubenswrapper[4816]: I0316 00:25:26.458658 4816 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/d8a56e88-900c-411c-b75c-029cf7bee318-build-blob-cache\") pod \"smart-gateway-operator-2-build\" (UID: \"d8a56e88-900c-411c-b75c-029cf7bee318\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 16 00:25:26 crc kubenswrapper[4816]: I0316 00:25:26.458902 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/d8a56e88-900c-411c-b75c-029cf7bee318-buildworkdir\") pod \"smart-gateway-operator-2-build\" (UID: \"d8a56e88-900c-411c-b75c-029cf7bee318\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 16 00:25:26 crc kubenswrapper[4816]: I0316 00:25:26.459092 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/d8a56e88-900c-411c-b75c-029cf7bee318-node-pullsecrets\") pod \"smart-gateway-operator-2-build\" (UID: \"d8a56e88-900c-411c-b75c-029cf7bee318\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 16 00:25:26 crc kubenswrapper[4816]: I0316 00:25:26.459180 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/d8a56e88-900c-411c-b75c-029cf7bee318-buildcachedir\") pod \"smart-gateway-operator-2-build\" (UID: \"d8a56e88-900c-411c-b75c-029cf7bee318\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 16 00:25:26 crc kubenswrapper[4816]: I0316 00:25:26.459236 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xshg5\" (UniqueName: \"kubernetes.io/projected/d8a56e88-900c-411c-b75c-029cf7bee318-kube-api-access-xshg5\") pod \"smart-gateway-operator-2-build\" (UID: \"d8a56e88-900c-411c-b75c-029cf7bee318\") " 
pod="service-telemetry/smart-gateway-operator-2-build" Mar 16 00:25:26 crc kubenswrapper[4816]: I0316 00:25:26.468703 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-2-build"] Mar 16 00:25:26 crc kubenswrapper[4816]: I0316 00:25:26.559565 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/d8a56e88-900c-411c-b75c-029cf7bee318-buildcachedir\") pod \"smart-gateway-operator-2-build\" (UID: \"d8a56e88-900c-411c-b75c-029cf7bee318\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 16 00:25:26 crc kubenswrapper[4816]: I0316 00:25:26.559613 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xshg5\" (UniqueName: \"kubernetes.io/projected/d8a56e88-900c-411c-b75c-029cf7bee318-kube-api-access-xshg5\") pod \"smart-gateway-operator-2-build\" (UID: \"d8a56e88-900c-411c-b75c-029cf7bee318\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 16 00:25:26 crc kubenswrapper[4816]: I0316 00:25:26.559635 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d8a56e88-900c-411c-b75c-029cf7bee318-build-proxy-ca-bundles\") pod \"smart-gateway-operator-2-build\" (UID: \"d8a56e88-900c-411c-b75c-029cf7bee318\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 16 00:25:26 crc kubenswrapper[4816]: I0316 00:25:26.559655 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/d8a56e88-900c-411c-b75c-029cf7bee318-container-storage-root\") pod \"smart-gateway-operator-2-build\" (UID: \"d8a56e88-900c-411c-b75c-029cf7bee318\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 16 00:25:26 crc kubenswrapper[4816]: I0316 00:25:26.559676 4816 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"builder-dockercfg-fs5z5-pull\" (UniqueName: \"kubernetes.io/secret/d8a56e88-900c-411c-b75c-029cf7bee318-builder-dockercfg-fs5z5-pull\") pod \"smart-gateway-operator-2-build\" (UID: \"d8a56e88-900c-411c-b75c-029cf7bee318\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 16 00:25:26 crc kubenswrapper[4816]: I0316 00:25:26.559683 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/d8a56e88-900c-411c-b75c-029cf7bee318-buildcachedir\") pod \"smart-gateway-operator-2-build\" (UID: \"d8a56e88-900c-411c-b75c-029cf7bee318\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 16 00:25:26 crc kubenswrapper[4816]: I0316 00:25:26.559697 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-fs5z5-push\" (UniqueName: \"kubernetes.io/secret/d8a56e88-900c-411c-b75c-029cf7bee318-builder-dockercfg-fs5z5-push\") pod \"smart-gateway-operator-2-build\" (UID: \"d8a56e88-900c-411c-b75c-029cf7bee318\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 16 00:25:26 crc kubenswrapper[4816]: I0316 00:25:26.559814 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/d8a56e88-900c-411c-b75c-029cf7bee318-build-system-configs\") pod \"smart-gateway-operator-2-build\" (UID: \"d8a56e88-900c-411c-b75c-029cf7bee318\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 16 00:25:26 crc kubenswrapper[4816]: I0316 00:25:26.559841 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d8a56e88-900c-411c-b75c-029cf7bee318-build-ca-bundles\") pod \"smart-gateway-operator-2-build\" (UID: \"d8a56e88-900c-411c-b75c-029cf7bee318\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 16 00:25:26 crc 
kubenswrapper[4816]: I0316 00:25:26.559876 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/d8a56e88-900c-411c-b75c-029cf7bee318-build-blob-cache\") pod \"smart-gateway-operator-2-build\" (UID: \"d8a56e88-900c-411c-b75c-029cf7bee318\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 16 00:25:26 crc kubenswrapper[4816]: I0316 00:25:26.559900 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/d8a56e88-900c-411c-b75c-029cf7bee318-container-storage-run\") pod \"smart-gateway-operator-2-build\" (UID: \"d8a56e88-900c-411c-b75c-029cf7bee318\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 16 00:25:26 crc kubenswrapper[4816]: I0316 00:25:26.559929 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/d8a56e88-900c-411c-b75c-029cf7bee318-buildworkdir\") pod \"smart-gateway-operator-2-build\" (UID: \"d8a56e88-900c-411c-b75c-029cf7bee318\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 16 00:25:26 crc kubenswrapper[4816]: I0316 00:25:26.560053 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/d8a56e88-900c-411c-b75c-029cf7bee318-node-pullsecrets\") pod \"smart-gateway-operator-2-build\" (UID: \"d8a56e88-900c-411c-b75c-029cf7bee318\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 16 00:25:26 crc kubenswrapper[4816]: I0316 00:25:26.560170 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/d8a56e88-900c-411c-b75c-029cf7bee318-node-pullsecrets\") pod \"smart-gateway-operator-2-build\" (UID: \"d8a56e88-900c-411c-b75c-029cf7bee318\") " pod="service-telemetry/smart-gateway-operator-2-build" 
Mar 16 00:25:26 crc kubenswrapper[4816]: I0316 00:25:26.560505 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d8a56e88-900c-411c-b75c-029cf7bee318-build-proxy-ca-bundles\") pod \"smart-gateway-operator-2-build\" (UID: \"d8a56e88-900c-411c-b75c-029cf7bee318\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 16 00:25:26 crc kubenswrapper[4816]: I0316 00:25:26.560583 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/d8a56e88-900c-411c-b75c-029cf7bee318-build-system-configs\") pod \"smart-gateway-operator-2-build\" (UID: \"d8a56e88-900c-411c-b75c-029cf7bee318\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 16 00:25:26 crc kubenswrapper[4816]: I0316 00:25:26.560773 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/d8a56e88-900c-411c-b75c-029cf7bee318-build-blob-cache\") pod \"smart-gateway-operator-2-build\" (UID: \"d8a56e88-900c-411c-b75c-029cf7bee318\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 16 00:25:26 crc kubenswrapper[4816]: I0316 00:25:26.560840 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/d8a56e88-900c-411c-b75c-029cf7bee318-buildworkdir\") pod \"smart-gateway-operator-2-build\" (UID: \"d8a56e88-900c-411c-b75c-029cf7bee318\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 16 00:25:26 crc kubenswrapper[4816]: I0316 00:25:26.560967 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/d8a56e88-900c-411c-b75c-029cf7bee318-container-storage-run\") pod \"smart-gateway-operator-2-build\" (UID: \"d8a56e88-900c-411c-b75c-029cf7bee318\") " 
pod="service-telemetry/smart-gateway-operator-2-build" Mar 16 00:25:26 crc kubenswrapper[4816]: I0316 00:25:26.561141 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/d8a56e88-900c-411c-b75c-029cf7bee318-container-storage-root\") pod \"smart-gateway-operator-2-build\" (UID: \"d8a56e88-900c-411c-b75c-029cf7bee318\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 16 00:25:26 crc kubenswrapper[4816]: I0316 00:25:26.561233 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d8a56e88-900c-411c-b75c-029cf7bee318-build-ca-bundles\") pod \"smart-gateway-operator-2-build\" (UID: \"d8a56e88-900c-411c-b75c-029cf7bee318\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 16 00:25:26 crc kubenswrapper[4816]: I0316 00:25:26.564176 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-fs5z5-push\" (UniqueName: \"kubernetes.io/secret/d8a56e88-900c-411c-b75c-029cf7bee318-builder-dockercfg-fs5z5-push\") pod \"smart-gateway-operator-2-build\" (UID: \"d8a56e88-900c-411c-b75c-029cf7bee318\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 16 00:25:26 crc kubenswrapper[4816]: I0316 00:25:26.564723 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-fs5z5-pull\" (UniqueName: \"kubernetes.io/secret/d8a56e88-900c-411c-b75c-029cf7bee318-builder-dockercfg-fs5z5-pull\") pod \"smart-gateway-operator-2-build\" (UID: \"d8a56e88-900c-411c-b75c-029cf7bee318\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 16 00:25:26 crc kubenswrapper[4816]: I0316 00:25:26.580885 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xshg5\" (UniqueName: \"kubernetes.io/projected/d8a56e88-900c-411c-b75c-029cf7bee318-kube-api-access-xshg5\") pod \"smart-gateway-operator-2-build\" 
(UID: \"d8a56e88-900c-411c-b75c-029cf7bee318\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 16 00:25:26 crc kubenswrapper[4816]: I0316 00:25:26.764380 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-2-build" Mar 16 00:25:26 crc kubenswrapper[4816]: I0316 00:25:26.947789 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-2-build"] Mar 16 00:25:27 crc kubenswrapper[4816]: I0316 00:25:27.674778 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56c73079-20fb-4653-955c-7c540b94c96d" path="/var/lib/kubelet/pods/56c73079-20fb-4653-955c-7c540b94c96d/volumes" Mar 16 00:25:27 crc kubenswrapper[4816]: I0316 00:25:27.755632 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-2-build" event={"ID":"d8a56e88-900c-411c-b75c-029cf7bee318","Type":"ContainerStarted","Data":"b5940147a61702adb2ae65b7102d9d568ce8f3720835c6af6b9846b8c1561cc9"} Mar 16 00:25:27 crc kubenswrapper[4816]: I0316 00:25:27.755694 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-2-build" event={"ID":"d8a56e88-900c-411c-b75c-029cf7bee318","Type":"ContainerStarted","Data":"9efbca1d81be5f3f1d30dc2767d52d0c0262c6327e71c0501df25c2d4f40ebae"} Mar 16 00:25:28 crc kubenswrapper[4816]: I0316 00:25:28.763287 4816 generic.go:334] "Generic (PLEG): container finished" podID="d8a56e88-900c-411c-b75c-029cf7bee318" containerID="b5940147a61702adb2ae65b7102d9d568ce8f3720835c6af6b9846b8c1561cc9" exitCode=0 Mar 16 00:25:28 crc kubenswrapper[4816]: I0316 00:25:28.763365 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-2-build" event={"ID":"d8a56e88-900c-411c-b75c-029cf7bee318","Type":"ContainerDied","Data":"b5940147a61702adb2ae65b7102d9d568ce8f3720835c6af6b9846b8c1561cc9"} Mar 16 00:25:29 crc kubenswrapper[4816]: I0316 
00:25:29.769169 4816 generic.go:334] "Generic (PLEG): container finished" podID="d8a56e88-900c-411c-b75c-029cf7bee318" containerID="37e4d06bccc38610b9b68c0996692bfadd6818d8d718c0c8b34e5fe1827d612c" exitCode=0 Mar 16 00:25:29 crc kubenswrapper[4816]: I0316 00:25:29.769402 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-2-build" event={"ID":"d8a56e88-900c-411c-b75c-029cf7bee318","Type":"ContainerDied","Data":"37e4d06bccc38610b9b68c0996692bfadd6818d8d718c0c8b34e5fe1827d612c"} Mar 16 00:25:29 crc kubenswrapper[4816]: I0316 00:25:29.803756 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-2-build_d8a56e88-900c-411c-b75c-029cf7bee318/manage-dockerfile/0.log" Mar 16 00:25:30 crc kubenswrapper[4816]: I0316 00:25:30.777225 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-2-build" event={"ID":"d8a56e88-900c-411c-b75c-029cf7bee318","Type":"ContainerStarted","Data":"520476cf1fdde0dec794a382c434e78783cc47141e612b047553998dee82f825"} Mar 16 00:25:30 crc kubenswrapper[4816]: I0316 00:25:30.804456 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/smart-gateway-operator-2-build" podStartSLOduration=4.804436397 podStartE2EDuration="4.804436397s" podCreationTimestamp="2026-03-16 00:25:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:25:30.802970527 +0000 UTC m=+1123.899270480" watchObservedRunningTime="2026-03-16 00:25:30.804436397 +0000 UTC m=+1123.900736360" Mar 16 00:25:31 crc kubenswrapper[4816]: I0316 00:25:31.862944 4816 patch_prober.go:28] interesting pod/machine-config-daemon-jrdcz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 16 00:25:31 crc kubenswrapper[4816]: I0316 00:25:31.863010 4816 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" podUID="dd08ece2-7636-4966-973a-e96a34b70b53" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 16 00:26:00 crc kubenswrapper[4816]: I0316 00:26:00.146121 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29560346-hjpvk"] Mar 16 00:26:00 crc kubenswrapper[4816]: I0316 00:26:00.147449 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560346-hjpvk" Mar 16 00:26:00 crc kubenswrapper[4816]: I0316 00:26:00.150322 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 16 00:26:00 crc kubenswrapper[4816]: I0316 00:26:00.150534 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-8hc2r" Mar 16 00:26:00 crc kubenswrapper[4816]: I0316 00:26:00.150686 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 16 00:26:00 crc kubenswrapper[4816]: I0316 00:26:00.157630 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29560346-hjpvk"] Mar 16 00:26:00 crc kubenswrapper[4816]: I0316 00:26:00.306623 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9h62\" (UniqueName: \"kubernetes.io/projected/2942e78f-05b7-486f-bee0-93a942f80d8a-kube-api-access-k9h62\") pod \"auto-csr-approver-29560346-hjpvk\" (UID: \"2942e78f-05b7-486f-bee0-93a942f80d8a\") " pod="openshift-infra/auto-csr-approver-29560346-hjpvk" Mar 16 00:26:00 crc kubenswrapper[4816]: I0316 00:26:00.408410 
4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k9h62\" (UniqueName: \"kubernetes.io/projected/2942e78f-05b7-486f-bee0-93a942f80d8a-kube-api-access-k9h62\") pod \"auto-csr-approver-29560346-hjpvk\" (UID: \"2942e78f-05b7-486f-bee0-93a942f80d8a\") " pod="openshift-infra/auto-csr-approver-29560346-hjpvk" Mar 16 00:26:00 crc kubenswrapper[4816]: I0316 00:26:00.433506 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9h62\" (UniqueName: \"kubernetes.io/projected/2942e78f-05b7-486f-bee0-93a942f80d8a-kube-api-access-k9h62\") pod \"auto-csr-approver-29560346-hjpvk\" (UID: \"2942e78f-05b7-486f-bee0-93a942f80d8a\") " pod="openshift-infra/auto-csr-approver-29560346-hjpvk" Mar 16 00:26:00 crc kubenswrapper[4816]: I0316 00:26:00.466348 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560346-hjpvk" Mar 16 00:26:00 crc kubenswrapper[4816]: I0316 00:26:00.701855 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29560346-hjpvk"] Mar 16 00:26:00 crc kubenswrapper[4816]: I0316 00:26:00.993597 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560346-hjpvk" event={"ID":"2942e78f-05b7-486f-bee0-93a942f80d8a","Type":"ContainerStarted","Data":"77b5743e696807d6ed2a02e91087afa64269c8810eed60a3b3323d5b46c5c105"} Mar 16 00:26:01 crc kubenswrapper[4816]: I0316 00:26:01.863585 4816 patch_prober.go:28] interesting pod/machine-config-daemon-jrdcz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 16 00:26:01 crc kubenswrapper[4816]: I0316 00:26:01.863635 4816 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" 
podUID="dd08ece2-7636-4966-973a-e96a34b70b53" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 16 00:26:01 crc kubenswrapper[4816]: I0316 00:26:01.863673 4816 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" Mar 16 00:26:01 crc kubenswrapper[4816]: I0316 00:26:01.864395 4816 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d963d56deb174bcc1b2f530e646e1a1dbd328868a82631422f67c019c313cf52"} pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 16 00:26:01 crc kubenswrapper[4816]: I0316 00:26:01.864448 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" podUID="dd08ece2-7636-4966-973a-e96a34b70b53" containerName="machine-config-daemon" containerID="cri-o://d963d56deb174bcc1b2f530e646e1a1dbd328868a82631422f67c019c313cf52" gracePeriod=600 Mar 16 00:26:02 crc kubenswrapper[4816]: I0316 00:26:02.004496 4816 generic.go:334] "Generic (PLEG): container finished" podID="dd08ece2-7636-4966-973a-e96a34b70b53" containerID="d963d56deb174bcc1b2f530e646e1a1dbd328868a82631422f67c019c313cf52" exitCode=0 Mar 16 00:26:02 crc kubenswrapper[4816]: I0316 00:26:02.004594 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" event={"ID":"dd08ece2-7636-4966-973a-e96a34b70b53","Type":"ContainerDied","Data":"d963d56deb174bcc1b2f530e646e1a1dbd328868a82631422f67c019c313cf52"} Mar 16 00:26:02 crc kubenswrapper[4816]: I0316 00:26:02.004696 4816 scope.go:117] "RemoveContainer" 
containerID="d940a23c182654ea98c304045d406af01d62b828901045324158f53e5e4988ad" Mar 16 00:26:03 crc kubenswrapper[4816]: I0316 00:26:03.011507 4816 generic.go:334] "Generic (PLEG): container finished" podID="2942e78f-05b7-486f-bee0-93a942f80d8a" containerID="0b8b2a24c4f32aff091a974cc84de6242e724aacb4bfa1cc19578627d86a25d5" exitCode=0 Mar 16 00:26:03 crc kubenswrapper[4816]: I0316 00:26:03.011579 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560346-hjpvk" event={"ID":"2942e78f-05b7-486f-bee0-93a942f80d8a","Type":"ContainerDied","Data":"0b8b2a24c4f32aff091a974cc84de6242e724aacb4bfa1cc19578627d86a25d5"} Mar 16 00:26:03 crc kubenswrapper[4816]: I0316 00:26:03.013802 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" event={"ID":"dd08ece2-7636-4966-973a-e96a34b70b53","Type":"ContainerStarted","Data":"4dae7771bcc5c45d3db6bc1014246492c003743ca85668bad7e04528051cc6bc"} Mar 16 00:26:04 crc kubenswrapper[4816]: I0316 00:26:04.396837 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560346-hjpvk" Mar 16 00:26:04 crc kubenswrapper[4816]: I0316 00:26:04.462973 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k9h62\" (UniqueName: \"kubernetes.io/projected/2942e78f-05b7-486f-bee0-93a942f80d8a-kube-api-access-k9h62\") pod \"2942e78f-05b7-486f-bee0-93a942f80d8a\" (UID: \"2942e78f-05b7-486f-bee0-93a942f80d8a\") " Mar 16 00:26:04 crc kubenswrapper[4816]: I0316 00:26:04.468718 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2942e78f-05b7-486f-bee0-93a942f80d8a-kube-api-access-k9h62" (OuterVolumeSpecName: "kube-api-access-k9h62") pod "2942e78f-05b7-486f-bee0-93a942f80d8a" (UID: "2942e78f-05b7-486f-bee0-93a942f80d8a"). InnerVolumeSpecName "kube-api-access-k9h62". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:26:04 crc kubenswrapper[4816]: I0316 00:26:04.563824 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k9h62\" (UniqueName: \"kubernetes.io/projected/2942e78f-05b7-486f-bee0-93a942f80d8a-kube-api-access-k9h62\") on node \"crc\" DevicePath \"\"" Mar 16 00:26:05 crc kubenswrapper[4816]: I0316 00:26:05.027540 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560346-hjpvk" event={"ID":"2942e78f-05b7-486f-bee0-93a942f80d8a","Type":"ContainerDied","Data":"77b5743e696807d6ed2a02e91087afa64269c8810eed60a3b3323d5b46c5c105"} Mar 16 00:26:05 crc kubenswrapper[4816]: I0316 00:26:05.027604 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="77b5743e696807d6ed2a02e91087afa64269c8810eed60a3b3323d5b46c5c105" Mar 16 00:26:05 crc kubenswrapper[4816]: I0316 00:26:05.027662 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560346-hjpvk" Mar 16 00:26:05 crc kubenswrapper[4816]: I0316 00:26:05.450070 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29560340-pmlmw"] Mar 16 00:26:05 crc kubenswrapper[4816]: I0316 00:26:05.456875 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29560340-pmlmw"] Mar 16 00:26:05 crc kubenswrapper[4816]: I0316 00:26:05.674694 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc958138-2767-4d7a-8f61-bd16b899189f" path="/var/lib/kubelet/pods/dc958138-2767-4d7a-8f61-bd16b899189f/volumes" Mar 16 00:26:33 crc kubenswrapper[4816]: I0316 00:26:33.233913 4816 generic.go:334] "Generic (PLEG): container finished" podID="d8a56e88-900c-411c-b75c-029cf7bee318" containerID="520476cf1fdde0dec794a382c434e78783cc47141e612b047553998dee82f825" exitCode=0 Mar 16 00:26:33 crc kubenswrapper[4816]: I0316 00:26:33.234000 4816 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-2-build" event={"ID":"d8a56e88-900c-411c-b75c-029cf7bee318","Type":"ContainerDied","Data":"520476cf1fdde0dec794a382c434e78783cc47141e612b047553998dee82f825"} Mar 16 00:26:34 crc kubenswrapper[4816]: I0316 00:26:34.511383 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-2-build" Mar 16 00:26:34 crc kubenswrapper[4816]: I0316 00:26:34.515079 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/d8a56e88-900c-411c-b75c-029cf7bee318-build-blob-cache\") pod \"d8a56e88-900c-411c-b75c-029cf7bee318\" (UID: \"d8a56e88-900c-411c-b75c-029cf7bee318\") " Mar 16 00:26:34 crc kubenswrapper[4816]: I0316 00:26:34.515226 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/d8a56e88-900c-411c-b75c-029cf7bee318-buildcachedir\") pod \"d8a56e88-900c-411c-b75c-029cf7bee318\" (UID: \"d8a56e88-900c-411c-b75c-029cf7bee318\") " Mar 16 00:26:34 crc kubenswrapper[4816]: I0316 00:26:34.515286 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/d8a56e88-900c-411c-b75c-029cf7bee318-container-storage-run\") pod \"d8a56e88-900c-411c-b75c-029cf7bee318\" (UID: \"d8a56e88-900c-411c-b75c-029cf7bee318\") " Mar 16 00:26:34 crc kubenswrapper[4816]: I0316 00:26:34.515316 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d8a56e88-900c-411c-b75c-029cf7bee318-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "d8a56e88-900c-411c-b75c-029cf7bee318" (UID: "d8a56e88-900c-411c-b75c-029cf7bee318"). InnerVolumeSpecName "buildcachedir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:26:34 crc kubenswrapper[4816]: I0316 00:26:34.515323 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-fs5z5-pull\" (UniqueName: \"kubernetes.io/secret/d8a56e88-900c-411c-b75c-029cf7bee318-builder-dockercfg-fs5z5-pull\") pod \"d8a56e88-900c-411c-b75c-029cf7bee318\" (UID: \"d8a56e88-900c-411c-b75c-029cf7bee318\") " Mar 16 00:26:34 crc kubenswrapper[4816]: I0316 00:26:34.515604 4816 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/d8a56e88-900c-411c-b75c-029cf7bee318-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 16 00:26:34 crc kubenswrapper[4816]: I0316 00:26:34.516232 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d8a56e88-900c-411c-b75c-029cf7bee318-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "d8a56e88-900c-411c-b75c-029cf7bee318" (UID: "d8a56e88-900c-411c-b75c-029cf7bee318"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:26:34 crc kubenswrapper[4816]: I0316 00:26:34.526640 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8a56e88-900c-411c-b75c-029cf7bee318-builder-dockercfg-fs5z5-pull" (OuterVolumeSpecName: "builder-dockercfg-fs5z5-pull") pod "d8a56e88-900c-411c-b75c-029cf7bee318" (UID: "d8a56e88-900c-411c-b75c-029cf7bee318"). InnerVolumeSpecName "builder-dockercfg-fs5z5-pull". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:26:34 crc kubenswrapper[4816]: I0316 00:26:34.616220 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/d8a56e88-900c-411c-b75c-029cf7bee318-buildworkdir\") pod \"d8a56e88-900c-411c-b75c-029cf7bee318\" (UID: \"d8a56e88-900c-411c-b75c-029cf7bee318\") " Mar 16 00:26:34 crc kubenswrapper[4816]: I0316 00:26:34.616269 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xshg5\" (UniqueName: \"kubernetes.io/projected/d8a56e88-900c-411c-b75c-029cf7bee318-kube-api-access-xshg5\") pod \"d8a56e88-900c-411c-b75c-029cf7bee318\" (UID: \"d8a56e88-900c-411c-b75c-029cf7bee318\") " Mar 16 00:26:34 crc kubenswrapper[4816]: I0316 00:26:34.616299 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d8a56e88-900c-411c-b75c-029cf7bee318-build-proxy-ca-bundles\") pod \"d8a56e88-900c-411c-b75c-029cf7bee318\" (UID: \"d8a56e88-900c-411c-b75c-029cf7bee318\") " Mar 16 00:26:34 crc kubenswrapper[4816]: I0316 00:26:34.616333 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-fs5z5-push\" (UniqueName: \"kubernetes.io/secret/d8a56e88-900c-411c-b75c-029cf7bee318-builder-dockercfg-fs5z5-push\") pod \"d8a56e88-900c-411c-b75c-029cf7bee318\" (UID: \"d8a56e88-900c-411c-b75c-029cf7bee318\") " Mar 16 00:26:34 crc kubenswrapper[4816]: I0316 00:26:34.616362 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/d8a56e88-900c-411c-b75c-029cf7bee318-node-pullsecrets\") pod \"d8a56e88-900c-411c-b75c-029cf7bee318\" (UID: \"d8a56e88-900c-411c-b75c-029cf7bee318\") " Mar 16 00:26:34 crc kubenswrapper[4816]: I0316 00:26:34.616380 4816 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/d8a56e88-900c-411c-b75c-029cf7bee318-build-system-configs\") pod \"d8a56e88-900c-411c-b75c-029cf7bee318\" (UID: \"d8a56e88-900c-411c-b75c-029cf7bee318\") " Mar 16 00:26:34 crc kubenswrapper[4816]: I0316 00:26:34.616421 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/d8a56e88-900c-411c-b75c-029cf7bee318-container-storage-root\") pod \"d8a56e88-900c-411c-b75c-029cf7bee318\" (UID: \"d8a56e88-900c-411c-b75c-029cf7bee318\") " Mar 16 00:26:34 crc kubenswrapper[4816]: I0316 00:26:34.616440 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d8a56e88-900c-411c-b75c-029cf7bee318-build-ca-bundles\") pod \"d8a56e88-900c-411c-b75c-029cf7bee318\" (UID: \"d8a56e88-900c-411c-b75c-029cf7bee318\") " Mar 16 00:26:34 crc kubenswrapper[4816]: I0316 00:26:34.616661 4816 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/d8a56e88-900c-411c-b75c-029cf7bee318-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 16 00:26:34 crc kubenswrapper[4816]: I0316 00:26:34.616673 4816 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-fs5z5-pull\" (UniqueName: \"kubernetes.io/secret/d8a56e88-900c-411c-b75c-029cf7bee318-builder-dockercfg-fs5z5-pull\") on node \"crc\" DevicePath \"\"" Mar 16 00:26:34 crc kubenswrapper[4816]: I0316 00:26:34.617359 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8a56e88-900c-411c-b75c-029cf7bee318-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "d8a56e88-900c-411c-b75c-029cf7bee318" (UID: "d8a56e88-900c-411c-b75c-029cf7bee318"). InnerVolumeSpecName "build-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:26:34 crc kubenswrapper[4816]: I0316 00:26:34.617436 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d8a56e88-900c-411c-b75c-029cf7bee318-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "d8a56e88-900c-411c-b75c-029cf7bee318" (UID: "d8a56e88-900c-411c-b75c-029cf7bee318"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:26:34 crc kubenswrapper[4816]: I0316 00:26:34.617854 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8a56e88-900c-411c-b75c-029cf7bee318-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "d8a56e88-900c-411c-b75c-029cf7bee318" (UID: "d8a56e88-900c-411c-b75c-029cf7bee318"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:26:34 crc kubenswrapper[4816]: I0316 00:26:34.618050 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8a56e88-900c-411c-b75c-029cf7bee318-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "d8a56e88-900c-411c-b75c-029cf7bee318" (UID: "d8a56e88-900c-411c-b75c-029cf7bee318"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:26:34 crc kubenswrapper[4816]: I0316 00:26:34.620734 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8a56e88-900c-411c-b75c-029cf7bee318-kube-api-access-xshg5" (OuterVolumeSpecName: "kube-api-access-xshg5") pod "d8a56e88-900c-411c-b75c-029cf7bee318" (UID: "d8a56e88-900c-411c-b75c-029cf7bee318"). InnerVolumeSpecName "kube-api-access-xshg5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:26:34 crc kubenswrapper[4816]: I0316 00:26:34.621250 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d8a56e88-900c-411c-b75c-029cf7bee318-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "d8a56e88-900c-411c-b75c-029cf7bee318" (UID: "d8a56e88-900c-411c-b75c-029cf7bee318"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:26:34 crc kubenswrapper[4816]: I0316 00:26:34.626784 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8a56e88-900c-411c-b75c-029cf7bee318-builder-dockercfg-fs5z5-push" (OuterVolumeSpecName: "builder-dockercfg-fs5z5-push") pod "d8a56e88-900c-411c-b75c-029cf7bee318" (UID: "d8a56e88-900c-411c-b75c-029cf7bee318"). InnerVolumeSpecName "builder-dockercfg-fs5z5-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:26:34 crc kubenswrapper[4816]: I0316 00:26:34.681789 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d8a56e88-900c-411c-b75c-029cf7bee318-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "d8a56e88-900c-411c-b75c-029cf7bee318" (UID: "d8a56e88-900c-411c-b75c-029cf7bee318"). InnerVolumeSpecName "build-blob-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:26:34 crc kubenswrapper[4816]: I0316 00:26:34.718529 4816 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d8a56e88-900c-411c-b75c-029cf7bee318-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 16 00:26:34 crc kubenswrapper[4816]: I0316 00:26:34.718584 4816 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/d8a56e88-900c-411c-b75c-029cf7bee318-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 16 00:26:34 crc kubenswrapper[4816]: I0316 00:26:34.718596 4816 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/d8a56e88-900c-411c-b75c-029cf7bee318-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 16 00:26:34 crc kubenswrapper[4816]: I0316 00:26:34.718605 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xshg5\" (UniqueName: \"kubernetes.io/projected/d8a56e88-900c-411c-b75c-029cf7bee318-kube-api-access-xshg5\") on node \"crc\" DevicePath \"\"" Mar 16 00:26:34 crc kubenswrapper[4816]: I0316 00:26:34.718617 4816 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d8a56e88-900c-411c-b75c-029cf7bee318-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 16 00:26:34 crc kubenswrapper[4816]: I0316 00:26:34.718626 4816 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-fs5z5-push\" (UniqueName: \"kubernetes.io/secret/d8a56e88-900c-411c-b75c-029cf7bee318-builder-dockercfg-fs5z5-push\") on node \"crc\" DevicePath \"\"" Mar 16 00:26:34 crc kubenswrapper[4816]: I0316 00:26:34.718635 4816 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/d8a56e88-900c-411c-b75c-029cf7bee318-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 
16 00:26:34 crc kubenswrapper[4816]: I0316 00:26:34.718644 4816 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/d8a56e88-900c-411c-b75c-029cf7bee318-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 16 00:26:35 crc kubenswrapper[4816]: I0316 00:26:35.254984 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-2-build" event={"ID":"d8a56e88-900c-411c-b75c-029cf7bee318","Type":"ContainerDied","Data":"9efbca1d81be5f3f1d30dc2767d52d0c0262c6327e71c0501df25c2d4f40ebae"} Mar 16 00:26:35 crc kubenswrapper[4816]: I0316 00:26:35.255034 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9efbca1d81be5f3f1d30dc2767d52d0c0262c6327e71c0501df25c2d4f40ebae" Mar 16 00:26:35 crc kubenswrapper[4816]: I0316 00:26:35.255128 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-2-build" Mar 16 00:26:36 crc kubenswrapper[4816]: I0316 00:26:36.466916 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d8a56e88-900c-411c-b75c-029cf7bee318-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "d8a56e88-900c-411c-b75c-029cf7bee318" (UID: "d8a56e88-900c-411c-b75c-029cf7bee318"). InnerVolumeSpecName "container-storage-root". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:26:36 crc kubenswrapper[4816]: I0316 00:26:36.542126 4816 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/d8a56e88-900c-411c-b75c-029cf7bee318-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 16 00:26:39 crc kubenswrapper[4816]: I0316 00:26:39.184736 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/sg-core-1-build"] Mar 16 00:26:39 crc kubenswrapper[4816]: E0316 00:26:39.185282 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2942e78f-05b7-486f-bee0-93a942f80d8a" containerName="oc" Mar 16 00:26:39 crc kubenswrapper[4816]: I0316 00:26:39.185298 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="2942e78f-05b7-486f-bee0-93a942f80d8a" containerName="oc" Mar 16 00:26:39 crc kubenswrapper[4816]: E0316 00:26:39.185313 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8a56e88-900c-411c-b75c-029cf7bee318" containerName="docker-build" Mar 16 00:26:39 crc kubenswrapper[4816]: I0316 00:26:39.185321 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8a56e88-900c-411c-b75c-029cf7bee318" containerName="docker-build" Mar 16 00:26:39 crc kubenswrapper[4816]: E0316 00:26:39.185333 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8a56e88-900c-411c-b75c-029cf7bee318" containerName="manage-dockerfile" Mar 16 00:26:39 crc kubenswrapper[4816]: I0316 00:26:39.185342 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8a56e88-900c-411c-b75c-029cf7bee318" containerName="manage-dockerfile" Mar 16 00:26:39 crc kubenswrapper[4816]: E0316 00:26:39.185373 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8a56e88-900c-411c-b75c-029cf7bee318" containerName="git-clone" Mar 16 00:26:39 crc kubenswrapper[4816]: I0316 00:26:39.185382 4816 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="d8a56e88-900c-411c-b75c-029cf7bee318" containerName="git-clone" Mar 16 00:26:39 crc kubenswrapper[4816]: I0316 00:26:39.185513 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="2942e78f-05b7-486f-bee0-93a942f80d8a" containerName="oc" Mar 16 00:26:39 crc kubenswrapper[4816]: I0316 00:26:39.185529 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8a56e88-900c-411c-b75c-029cf7bee318" containerName="docker-build" Mar 16 00:26:39 crc kubenswrapper[4816]: I0316 00:26:39.186408 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-core-1-build" Mar 16 00:26:39 crc kubenswrapper[4816]: I0316 00:26:39.188536 4816 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-fs5z5" Mar 16 00:26:39 crc kubenswrapper[4816]: I0316 00:26:39.189082 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-core-1-ca" Mar 16 00:26:39 crc kubenswrapper[4816]: I0316 00:26:39.191027 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-core-1-sys-config" Mar 16 00:26:39 crc kubenswrapper[4816]: I0316 00:26:39.191077 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-core-1-global-ca" Mar 16 00:26:39 crc kubenswrapper[4816]: I0316 00:26:39.202256 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-core-1-build"] Mar 16 00:26:39 crc kubenswrapper[4816]: I0316 00:26:39.284271 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-fs5z5-pull\" (UniqueName: \"kubernetes.io/secret/c622ccbe-7da3-4233-905c-bd38932a01ff-builder-dockercfg-fs5z5-pull\") pod \"sg-core-1-build\" (UID: \"c622ccbe-7da3-4233-905c-bd38932a01ff\") " pod="service-telemetry/sg-core-1-build" Mar 16 00:26:39 crc kubenswrapper[4816]: I0316 00:26:39.284322 4816 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/c622ccbe-7da3-4233-905c-bd38932a01ff-buildcachedir\") pod \"sg-core-1-build\" (UID: \"c622ccbe-7da3-4233-905c-bd38932a01ff\") " pod="service-telemetry/sg-core-1-build" Mar 16 00:26:39 crc kubenswrapper[4816]: I0316 00:26:39.284349 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/c622ccbe-7da3-4233-905c-bd38932a01ff-container-storage-run\") pod \"sg-core-1-build\" (UID: \"c622ccbe-7da3-4233-905c-bd38932a01ff\") " pod="service-telemetry/sg-core-1-build" Mar 16 00:26:39 crc kubenswrapper[4816]: I0316 00:26:39.284688 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c622ccbe-7da3-4233-905c-bd38932a01ff-build-proxy-ca-bundles\") pod \"sg-core-1-build\" (UID: \"c622ccbe-7da3-4233-905c-bd38932a01ff\") " pod="service-telemetry/sg-core-1-build" Mar 16 00:26:39 crc kubenswrapper[4816]: I0316 00:26:39.284757 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/c622ccbe-7da3-4233-905c-bd38932a01ff-container-storage-root\") pod \"sg-core-1-build\" (UID: \"c622ccbe-7da3-4233-905c-bd38932a01ff\") " pod="service-telemetry/sg-core-1-build" Mar 16 00:26:39 crc kubenswrapper[4816]: I0316 00:26:39.284812 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shrhw\" (UniqueName: \"kubernetes.io/projected/c622ccbe-7da3-4233-905c-bd38932a01ff-kube-api-access-shrhw\") pod \"sg-core-1-build\" (UID: \"c622ccbe-7da3-4233-905c-bd38932a01ff\") " pod="service-telemetry/sg-core-1-build" Mar 16 00:26:39 crc kubenswrapper[4816]: I0316 
00:26:39.284836 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/c622ccbe-7da3-4233-905c-bd38932a01ff-build-system-configs\") pod \"sg-core-1-build\" (UID: \"c622ccbe-7da3-4233-905c-bd38932a01ff\") " pod="service-telemetry/sg-core-1-build" Mar 16 00:26:39 crc kubenswrapper[4816]: I0316 00:26:39.284873 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/c622ccbe-7da3-4233-905c-bd38932a01ff-node-pullsecrets\") pod \"sg-core-1-build\" (UID: \"c622ccbe-7da3-4233-905c-bd38932a01ff\") " pod="service-telemetry/sg-core-1-build" Mar 16 00:26:39 crc kubenswrapper[4816]: I0316 00:26:39.284900 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/c622ccbe-7da3-4233-905c-bd38932a01ff-buildworkdir\") pod \"sg-core-1-build\" (UID: \"c622ccbe-7da3-4233-905c-bd38932a01ff\") " pod="service-telemetry/sg-core-1-build" Mar 16 00:26:39 crc kubenswrapper[4816]: I0316 00:26:39.284964 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c622ccbe-7da3-4233-905c-bd38932a01ff-build-ca-bundles\") pod \"sg-core-1-build\" (UID: \"c622ccbe-7da3-4233-905c-bd38932a01ff\") " pod="service-telemetry/sg-core-1-build" Mar 16 00:26:39 crc kubenswrapper[4816]: I0316 00:26:39.284992 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/c622ccbe-7da3-4233-905c-bd38932a01ff-build-blob-cache\") pod \"sg-core-1-build\" (UID: \"c622ccbe-7da3-4233-905c-bd38932a01ff\") " pod="service-telemetry/sg-core-1-build" Mar 16 00:26:39 crc kubenswrapper[4816]: I0316 00:26:39.285035 4816 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-fs5z5-push\" (UniqueName: \"kubernetes.io/secret/c622ccbe-7da3-4233-905c-bd38932a01ff-builder-dockercfg-fs5z5-push\") pod \"sg-core-1-build\" (UID: \"c622ccbe-7da3-4233-905c-bd38932a01ff\") " pod="service-telemetry/sg-core-1-build" Mar 16 00:26:39 crc kubenswrapper[4816]: I0316 00:26:39.386957 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-fs5z5-push\" (UniqueName: \"kubernetes.io/secret/c622ccbe-7da3-4233-905c-bd38932a01ff-builder-dockercfg-fs5z5-push\") pod \"sg-core-1-build\" (UID: \"c622ccbe-7da3-4233-905c-bd38932a01ff\") " pod="service-telemetry/sg-core-1-build" Mar 16 00:26:39 crc kubenswrapper[4816]: I0316 00:26:39.387067 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-fs5z5-pull\" (UniqueName: \"kubernetes.io/secret/c622ccbe-7da3-4233-905c-bd38932a01ff-builder-dockercfg-fs5z5-pull\") pod \"sg-core-1-build\" (UID: \"c622ccbe-7da3-4233-905c-bd38932a01ff\") " pod="service-telemetry/sg-core-1-build" Mar 16 00:26:39 crc kubenswrapper[4816]: I0316 00:26:39.387123 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/c622ccbe-7da3-4233-905c-bd38932a01ff-buildcachedir\") pod \"sg-core-1-build\" (UID: \"c622ccbe-7da3-4233-905c-bd38932a01ff\") " pod="service-telemetry/sg-core-1-build" Mar 16 00:26:39 crc kubenswrapper[4816]: I0316 00:26:39.387180 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/c622ccbe-7da3-4233-905c-bd38932a01ff-container-storage-run\") pod \"sg-core-1-build\" (UID: \"c622ccbe-7da3-4233-905c-bd38932a01ff\") " pod="service-telemetry/sg-core-1-build" Mar 16 00:26:39 crc kubenswrapper[4816]: I0316 00:26:39.387483 4816 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c622ccbe-7da3-4233-905c-bd38932a01ff-build-proxy-ca-bundles\") pod \"sg-core-1-build\" (UID: \"c622ccbe-7da3-4233-905c-bd38932a01ff\") " pod="service-telemetry/sg-core-1-build" Mar 16 00:26:39 crc kubenswrapper[4816]: I0316 00:26:39.387591 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/c622ccbe-7da3-4233-905c-bd38932a01ff-container-storage-root\") pod \"sg-core-1-build\" (UID: \"c622ccbe-7da3-4233-905c-bd38932a01ff\") " pod="service-telemetry/sg-core-1-build" Mar 16 00:26:39 crc kubenswrapper[4816]: I0316 00:26:39.387606 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/c622ccbe-7da3-4233-905c-bd38932a01ff-buildcachedir\") pod \"sg-core-1-build\" (UID: \"c622ccbe-7da3-4233-905c-bd38932a01ff\") " pod="service-telemetry/sg-core-1-build" Mar 16 00:26:39 crc kubenswrapper[4816]: I0316 00:26:39.387660 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-shrhw\" (UniqueName: \"kubernetes.io/projected/c622ccbe-7da3-4233-905c-bd38932a01ff-kube-api-access-shrhw\") pod \"sg-core-1-build\" (UID: \"c622ccbe-7da3-4233-905c-bd38932a01ff\") " pod="service-telemetry/sg-core-1-build" Mar 16 00:26:39 crc kubenswrapper[4816]: I0316 00:26:39.387705 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/c622ccbe-7da3-4233-905c-bd38932a01ff-build-system-configs\") pod \"sg-core-1-build\" (UID: \"c622ccbe-7da3-4233-905c-bd38932a01ff\") " pod="service-telemetry/sg-core-1-build" Mar 16 00:26:39 crc kubenswrapper[4816]: I0316 00:26:39.387752 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: 
\"kubernetes.io/host-path/c622ccbe-7da3-4233-905c-bd38932a01ff-node-pullsecrets\") pod \"sg-core-1-build\" (UID: \"c622ccbe-7da3-4233-905c-bd38932a01ff\") " pod="service-telemetry/sg-core-1-build" Mar 16 00:26:39 crc kubenswrapper[4816]: I0316 00:26:39.387795 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/c622ccbe-7da3-4233-905c-bd38932a01ff-buildworkdir\") pod \"sg-core-1-build\" (UID: \"c622ccbe-7da3-4233-905c-bd38932a01ff\") " pod="service-telemetry/sg-core-1-build" Mar 16 00:26:39 crc kubenswrapper[4816]: I0316 00:26:39.387845 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c622ccbe-7da3-4233-905c-bd38932a01ff-build-ca-bundles\") pod \"sg-core-1-build\" (UID: \"c622ccbe-7da3-4233-905c-bd38932a01ff\") " pod="service-telemetry/sg-core-1-build" Mar 16 00:26:39 crc kubenswrapper[4816]: I0316 00:26:39.387884 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/c622ccbe-7da3-4233-905c-bd38932a01ff-build-blob-cache\") pod \"sg-core-1-build\" (UID: \"c622ccbe-7da3-4233-905c-bd38932a01ff\") " pod="service-telemetry/sg-core-1-build" Mar 16 00:26:39 crc kubenswrapper[4816]: I0316 00:26:39.388367 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/c622ccbe-7da3-4233-905c-bd38932a01ff-container-storage-run\") pod \"sg-core-1-build\" (UID: \"c622ccbe-7da3-4233-905c-bd38932a01ff\") " pod="service-telemetry/sg-core-1-build" Mar 16 00:26:39 crc kubenswrapper[4816]: I0316 00:26:39.388584 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/c622ccbe-7da3-4233-905c-bd38932a01ff-node-pullsecrets\") pod \"sg-core-1-build\" (UID: 
\"c622ccbe-7da3-4233-905c-bd38932a01ff\") " pod="service-telemetry/sg-core-1-build" Mar 16 00:26:39 crc kubenswrapper[4816]: I0316 00:26:39.388714 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/c622ccbe-7da3-4233-905c-bd38932a01ff-build-blob-cache\") pod \"sg-core-1-build\" (UID: \"c622ccbe-7da3-4233-905c-bd38932a01ff\") " pod="service-telemetry/sg-core-1-build" Mar 16 00:26:39 crc kubenswrapper[4816]: I0316 00:26:39.388889 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/c622ccbe-7da3-4233-905c-bd38932a01ff-build-system-configs\") pod \"sg-core-1-build\" (UID: \"c622ccbe-7da3-4233-905c-bd38932a01ff\") " pod="service-telemetry/sg-core-1-build" Mar 16 00:26:39 crc kubenswrapper[4816]: I0316 00:26:39.389118 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c622ccbe-7da3-4233-905c-bd38932a01ff-build-proxy-ca-bundles\") pod \"sg-core-1-build\" (UID: \"c622ccbe-7da3-4233-905c-bd38932a01ff\") " pod="service-telemetry/sg-core-1-build" Mar 16 00:26:39 crc kubenswrapper[4816]: I0316 00:26:39.389851 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/c622ccbe-7da3-4233-905c-bd38932a01ff-container-storage-root\") pod \"sg-core-1-build\" (UID: \"c622ccbe-7da3-4233-905c-bd38932a01ff\") " pod="service-telemetry/sg-core-1-build" Mar 16 00:26:39 crc kubenswrapper[4816]: I0316 00:26:39.390090 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/c622ccbe-7da3-4233-905c-bd38932a01ff-buildworkdir\") pod \"sg-core-1-build\" (UID: \"c622ccbe-7da3-4233-905c-bd38932a01ff\") " pod="service-telemetry/sg-core-1-build" Mar 16 00:26:39 crc kubenswrapper[4816]: I0316 
00:26:39.391063 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c622ccbe-7da3-4233-905c-bd38932a01ff-build-ca-bundles\") pod \"sg-core-1-build\" (UID: \"c622ccbe-7da3-4233-905c-bd38932a01ff\") " pod="service-telemetry/sg-core-1-build" Mar 16 00:26:39 crc kubenswrapper[4816]: I0316 00:26:39.393439 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-fs5z5-push\" (UniqueName: \"kubernetes.io/secret/c622ccbe-7da3-4233-905c-bd38932a01ff-builder-dockercfg-fs5z5-push\") pod \"sg-core-1-build\" (UID: \"c622ccbe-7da3-4233-905c-bd38932a01ff\") " pod="service-telemetry/sg-core-1-build" Mar 16 00:26:39 crc kubenswrapper[4816]: I0316 00:26:39.394842 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-fs5z5-pull\" (UniqueName: \"kubernetes.io/secret/c622ccbe-7da3-4233-905c-bd38932a01ff-builder-dockercfg-fs5z5-pull\") pod \"sg-core-1-build\" (UID: \"c622ccbe-7da3-4233-905c-bd38932a01ff\") " pod="service-telemetry/sg-core-1-build" Mar 16 00:26:39 crc kubenswrapper[4816]: I0316 00:26:39.418318 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-shrhw\" (UniqueName: \"kubernetes.io/projected/c622ccbe-7da3-4233-905c-bd38932a01ff-kube-api-access-shrhw\") pod \"sg-core-1-build\" (UID: \"c622ccbe-7da3-4233-905c-bd38932a01ff\") " pod="service-telemetry/sg-core-1-build" Mar 16 00:26:39 crc kubenswrapper[4816]: I0316 00:26:39.549315 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-core-1-build" Mar 16 00:26:39 crc kubenswrapper[4816]: I0316 00:26:39.773039 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-core-1-build"] Mar 16 00:26:40 crc kubenswrapper[4816]: I0316 00:26:40.291671 4816 generic.go:334] "Generic (PLEG): container finished" podID="c622ccbe-7da3-4233-905c-bd38932a01ff" containerID="262c685bc20e1d8192cb9a33c4a9c13cbf0d3368b0f0fd15e2dda89d5a3189fe" exitCode=0 Mar 16 00:26:40 crc kubenswrapper[4816]: I0316 00:26:40.291767 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-1-build" event={"ID":"c622ccbe-7da3-4233-905c-bd38932a01ff","Type":"ContainerDied","Data":"262c685bc20e1d8192cb9a33c4a9c13cbf0d3368b0f0fd15e2dda89d5a3189fe"} Mar 16 00:26:40 crc kubenswrapper[4816]: I0316 00:26:40.291952 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-1-build" event={"ID":"c622ccbe-7da3-4233-905c-bd38932a01ff","Type":"ContainerStarted","Data":"8c4febb72e74d39d24c5326cf7e1359f6880581dd8b05174a072a1a50b9f6d8b"} Mar 16 00:26:41 crc kubenswrapper[4816]: I0316 00:26:41.300078 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-1-build" event={"ID":"c622ccbe-7da3-4233-905c-bd38932a01ff","Type":"ContainerStarted","Data":"53231c94bb394a06af6227e423c52884f2e3bec37998f4318e285a21da150a08"} Mar 16 00:26:41 crc kubenswrapper[4816]: I0316 00:26:41.323943 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/sg-core-1-build" podStartSLOduration=2.323920757 podStartE2EDuration="2.323920757s" podCreationTimestamp="2026-03-16 00:26:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:26:41.320154813 +0000 UTC m=+1194.416454756" watchObservedRunningTime="2026-03-16 00:26:41.323920757 +0000 UTC m=+1194.420220710" Mar 16 00:26:49 crc 
kubenswrapper[4816]: I0316 00:26:49.580526 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/sg-core-1-build"] Mar 16 00:26:49 crc kubenswrapper[4816]: I0316 00:26:49.581238 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/sg-core-1-build" podUID="c622ccbe-7da3-4233-905c-bd38932a01ff" containerName="docker-build" containerID="cri-o://53231c94bb394a06af6227e423c52884f2e3bec37998f4318e285a21da150a08" gracePeriod=30 Mar 16 00:26:49 crc kubenswrapper[4816]: I0316 00:26:49.940263 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_sg-core-1-build_c622ccbe-7da3-4233-905c-bd38932a01ff/docker-build/0.log" Mar 16 00:26:49 crc kubenswrapper[4816]: I0316 00:26:49.941078 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-core-1-build" Mar 16 00:26:50 crc kubenswrapper[4816]: I0316 00:26:50.159872 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-fs5z5-pull\" (UniqueName: \"kubernetes.io/secret/c622ccbe-7da3-4233-905c-bd38932a01ff-builder-dockercfg-fs5z5-pull\") pod \"c622ccbe-7da3-4233-905c-bd38932a01ff\" (UID: \"c622ccbe-7da3-4233-905c-bd38932a01ff\") " Mar 16 00:26:50 crc kubenswrapper[4816]: I0316 00:26:50.160379 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c622ccbe-7da3-4233-905c-bd38932a01ff-build-ca-bundles\") pod \"c622ccbe-7da3-4233-905c-bd38932a01ff\" (UID: \"c622ccbe-7da3-4233-905c-bd38932a01ff\") " Mar 16 00:26:50 crc kubenswrapper[4816]: I0316 00:26:50.160614 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-shrhw\" (UniqueName: \"kubernetes.io/projected/c622ccbe-7da3-4233-905c-bd38932a01ff-kube-api-access-shrhw\") pod \"c622ccbe-7da3-4233-905c-bd38932a01ff\" (UID: \"c622ccbe-7da3-4233-905c-bd38932a01ff\") 
" Mar 16 00:26:50 crc kubenswrapper[4816]: I0316 00:26:50.160692 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/c622ccbe-7da3-4233-905c-bd38932a01ff-build-system-configs\") pod \"c622ccbe-7da3-4233-905c-bd38932a01ff\" (UID: \"c622ccbe-7da3-4233-905c-bd38932a01ff\") " Mar 16 00:26:50 crc kubenswrapper[4816]: I0316 00:26:50.160727 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-fs5z5-push\" (UniqueName: \"kubernetes.io/secret/c622ccbe-7da3-4233-905c-bd38932a01ff-builder-dockercfg-fs5z5-push\") pod \"c622ccbe-7da3-4233-905c-bd38932a01ff\" (UID: \"c622ccbe-7da3-4233-905c-bd38932a01ff\") " Mar 16 00:26:50 crc kubenswrapper[4816]: I0316 00:26:50.160916 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c622ccbe-7da3-4233-905c-bd38932a01ff-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "c622ccbe-7da3-4233-905c-bd38932a01ff" (UID: "c622ccbe-7da3-4233-905c-bd38932a01ff"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:26:50 crc kubenswrapper[4816]: I0316 00:26:50.161417 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c622ccbe-7da3-4233-905c-bd38932a01ff-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "c622ccbe-7da3-4233-905c-bd38932a01ff" (UID: "c622ccbe-7da3-4233-905c-bd38932a01ff"). InnerVolumeSpecName "build-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:26:50 crc kubenswrapper[4816]: I0316 00:26:50.161600 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/c622ccbe-7da3-4233-905c-bd38932a01ff-buildcachedir\") pod \"c622ccbe-7da3-4233-905c-bd38932a01ff\" (UID: \"c622ccbe-7da3-4233-905c-bd38932a01ff\") " Mar 16 00:26:50 crc kubenswrapper[4816]: I0316 00:26:50.161683 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c622ccbe-7da3-4233-905c-bd38932a01ff-build-proxy-ca-bundles\") pod \"c622ccbe-7da3-4233-905c-bd38932a01ff\" (UID: \"c622ccbe-7da3-4233-905c-bd38932a01ff\") " Mar 16 00:26:50 crc kubenswrapper[4816]: I0316 00:26:50.161740 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/c622ccbe-7da3-4233-905c-bd38932a01ff-buildworkdir\") pod \"c622ccbe-7da3-4233-905c-bd38932a01ff\" (UID: \"c622ccbe-7da3-4233-905c-bd38932a01ff\") " Mar 16 00:26:50 crc kubenswrapper[4816]: I0316 00:26:50.161903 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/c622ccbe-7da3-4233-905c-bd38932a01ff-build-blob-cache\") pod \"c622ccbe-7da3-4233-905c-bd38932a01ff\" (UID: \"c622ccbe-7da3-4233-905c-bd38932a01ff\") " Mar 16 00:26:50 crc kubenswrapper[4816]: I0316 00:26:50.162149 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/c622ccbe-7da3-4233-905c-bd38932a01ff-container-storage-run\") pod \"c622ccbe-7da3-4233-905c-bd38932a01ff\" (UID: \"c622ccbe-7da3-4233-905c-bd38932a01ff\") " Mar 16 00:26:50 crc kubenswrapper[4816]: I0316 00:26:50.162274 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/c622ccbe-7da3-4233-905c-bd38932a01ff-container-storage-root\") pod \"c622ccbe-7da3-4233-905c-bd38932a01ff\" (UID: \"c622ccbe-7da3-4233-905c-bd38932a01ff\") " Mar 16 00:26:50 crc kubenswrapper[4816]: I0316 00:26:50.162328 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/c622ccbe-7da3-4233-905c-bd38932a01ff-node-pullsecrets\") pod \"c622ccbe-7da3-4233-905c-bd38932a01ff\" (UID: \"c622ccbe-7da3-4233-905c-bd38932a01ff\") " Mar 16 00:26:50 crc kubenswrapper[4816]: I0316 00:26:50.162422 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c622ccbe-7da3-4233-905c-bd38932a01ff-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "c622ccbe-7da3-4233-905c-bd38932a01ff" (UID: "c622ccbe-7da3-4233-905c-bd38932a01ff"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:26:50 crc kubenswrapper[4816]: I0316 00:26:50.162427 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c622ccbe-7da3-4233-905c-bd38932a01ff-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "c622ccbe-7da3-4233-905c-bd38932a01ff" (UID: "c622ccbe-7da3-4233-905c-bd38932a01ff"). InnerVolumeSpecName "build-system-configs". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:26:50 crc kubenswrapper[4816]: I0316 00:26:50.162921 4816 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c622ccbe-7da3-4233-905c-bd38932a01ff-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 16 00:26:50 crc kubenswrapper[4816]: I0316 00:26:50.163170 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c622ccbe-7da3-4233-905c-bd38932a01ff-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "c622ccbe-7da3-4233-905c-bd38932a01ff" (UID: "c622ccbe-7da3-4233-905c-bd38932a01ff"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:26:50 crc kubenswrapper[4816]: I0316 00:26:50.163201 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c622ccbe-7da3-4233-905c-bd38932a01ff-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "c622ccbe-7da3-4233-905c-bd38932a01ff" (UID: "c622ccbe-7da3-4233-905c-bd38932a01ff"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:26:50 crc kubenswrapper[4816]: I0316 00:26:50.163098 4816 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/c622ccbe-7da3-4233-905c-bd38932a01ff-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 16 00:26:50 crc kubenswrapper[4816]: I0316 00:26:50.163341 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c622ccbe-7da3-4233-905c-bd38932a01ff-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "c622ccbe-7da3-4233-905c-bd38932a01ff" (UID: "c622ccbe-7da3-4233-905c-bd38932a01ff"). InnerVolumeSpecName "build-proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:26:50 crc kubenswrapper[4816]: I0316 00:26:50.163347 4816 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/c622ccbe-7da3-4233-905c-bd38932a01ff-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 16 00:26:50 crc kubenswrapper[4816]: I0316 00:26:50.163377 4816 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/c622ccbe-7da3-4233-905c-bd38932a01ff-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 16 00:26:50 crc kubenswrapper[4816]: I0316 00:26:50.166100 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c622ccbe-7da3-4233-905c-bd38932a01ff-builder-dockercfg-fs5z5-pull" (OuterVolumeSpecName: "builder-dockercfg-fs5z5-pull") pod "c622ccbe-7da3-4233-905c-bd38932a01ff" (UID: "c622ccbe-7da3-4233-905c-bd38932a01ff"). InnerVolumeSpecName "builder-dockercfg-fs5z5-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:26:50 crc kubenswrapper[4816]: I0316 00:26:50.166959 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c622ccbe-7da3-4233-905c-bd38932a01ff-kube-api-access-shrhw" (OuterVolumeSpecName: "kube-api-access-shrhw") pod "c622ccbe-7da3-4233-905c-bd38932a01ff" (UID: "c622ccbe-7da3-4233-905c-bd38932a01ff"). InnerVolumeSpecName "kube-api-access-shrhw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:26:50 crc kubenswrapper[4816]: I0316 00:26:50.169776 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c622ccbe-7da3-4233-905c-bd38932a01ff-builder-dockercfg-fs5z5-push" (OuterVolumeSpecName: "builder-dockercfg-fs5z5-push") pod "c622ccbe-7da3-4233-905c-bd38932a01ff" (UID: "c622ccbe-7da3-4233-905c-bd38932a01ff"). InnerVolumeSpecName "builder-dockercfg-fs5z5-push". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:26:50 crc kubenswrapper[4816]: I0316 00:26:50.264819 4816 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/c622ccbe-7da3-4233-905c-bd38932a01ff-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 16 00:26:50 crc kubenswrapper[4816]: I0316 00:26:50.264856 4816 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-fs5z5-pull\" (UniqueName: \"kubernetes.io/secret/c622ccbe-7da3-4233-905c-bd38932a01ff-builder-dockercfg-fs5z5-pull\") on node \"crc\" DevicePath \"\"" Mar 16 00:26:50 crc kubenswrapper[4816]: I0316 00:26:50.264871 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-shrhw\" (UniqueName: \"kubernetes.io/projected/c622ccbe-7da3-4233-905c-bd38932a01ff-kube-api-access-shrhw\") on node \"crc\" DevicePath \"\"" Mar 16 00:26:50 crc kubenswrapper[4816]: I0316 00:26:50.264883 4816 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-fs5z5-push\" (UniqueName: \"kubernetes.io/secret/c622ccbe-7da3-4233-905c-bd38932a01ff-builder-dockercfg-fs5z5-push\") on node \"crc\" DevicePath \"\"" Mar 16 00:26:50 crc kubenswrapper[4816]: I0316 00:26:50.264894 4816 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c622ccbe-7da3-4233-905c-bd38932a01ff-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 16 00:26:50 crc kubenswrapper[4816]: I0316 00:26:50.264904 4816 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/c622ccbe-7da3-4233-905c-bd38932a01ff-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 16 00:26:50 crc kubenswrapper[4816]: I0316 00:26:50.271889 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c622ccbe-7da3-4233-905c-bd38932a01ff-build-blob-cache" 
(OuterVolumeSpecName: "build-blob-cache") pod "c622ccbe-7da3-4233-905c-bd38932a01ff" (UID: "c622ccbe-7da3-4233-905c-bd38932a01ff"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:26:50 crc kubenswrapper[4816]: I0316 00:26:50.365510 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_sg-core-1-build_c622ccbe-7da3-4233-905c-bd38932a01ff/docker-build/0.log" Mar 16 00:26:50 crc kubenswrapper[4816]: I0316 00:26:50.365734 4816 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/c622ccbe-7da3-4233-905c-bd38932a01ff-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 16 00:26:50 crc kubenswrapper[4816]: I0316 00:26:50.366363 4816 generic.go:334] "Generic (PLEG): container finished" podID="c622ccbe-7da3-4233-905c-bd38932a01ff" containerID="53231c94bb394a06af6227e423c52884f2e3bec37998f4318e285a21da150a08" exitCode=1 Mar 16 00:26:50 crc kubenswrapper[4816]: I0316 00:26:50.366418 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-1-build" event={"ID":"c622ccbe-7da3-4233-905c-bd38932a01ff","Type":"ContainerDied","Data":"53231c94bb394a06af6227e423c52884f2e3bec37998f4318e285a21da150a08"} Mar 16 00:26:50 crc kubenswrapper[4816]: I0316 00:26:50.366478 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-1-build" event={"ID":"c622ccbe-7da3-4233-905c-bd38932a01ff","Type":"ContainerDied","Data":"8c4febb72e74d39d24c5326cf7e1359f6880581dd8b05174a072a1a50b9f6d8b"} Mar 16 00:26:50 crc kubenswrapper[4816]: I0316 00:26:50.366476 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-core-1-build" Mar 16 00:26:50 crc kubenswrapper[4816]: I0316 00:26:50.366501 4816 scope.go:117] "RemoveContainer" containerID="53231c94bb394a06af6227e423c52884f2e3bec37998f4318e285a21da150a08" Mar 16 00:26:50 crc kubenswrapper[4816]: I0316 00:26:50.410628 4816 scope.go:117] "RemoveContainer" containerID="262c685bc20e1d8192cb9a33c4a9c13cbf0d3368b0f0fd15e2dda89d5a3189fe" Mar 16 00:26:50 crc kubenswrapper[4816]: I0316 00:26:50.433889 4816 scope.go:117] "RemoveContainer" containerID="53231c94bb394a06af6227e423c52884f2e3bec37998f4318e285a21da150a08" Mar 16 00:26:50 crc kubenswrapper[4816]: E0316 00:26:50.434355 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"53231c94bb394a06af6227e423c52884f2e3bec37998f4318e285a21da150a08\": container with ID starting with 53231c94bb394a06af6227e423c52884f2e3bec37998f4318e285a21da150a08 not found: ID does not exist" containerID="53231c94bb394a06af6227e423c52884f2e3bec37998f4318e285a21da150a08" Mar 16 00:26:50 crc kubenswrapper[4816]: I0316 00:26:50.434410 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53231c94bb394a06af6227e423c52884f2e3bec37998f4318e285a21da150a08"} err="failed to get container status \"53231c94bb394a06af6227e423c52884f2e3bec37998f4318e285a21da150a08\": rpc error: code = NotFound desc = could not find container \"53231c94bb394a06af6227e423c52884f2e3bec37998f4318e285a21da150a08\": container with ID starting with 53231c94bb394a06af6227e423c52884f2e3bec37998f4318e285a21da150a08 not found: ID does not exist" Mar 16 00:26:50 crc kubenswrapper[4816]: I0316 00:26:50.434439 4816 scope.go:117] "RemoveContainer" containerID="262c685bc20e1d8192cb9a33c4a9c13cbf0d3368b0f0fd15e2dda89d5a3189fe" Mar 16 00:26:50 crc kubenswrapper[4816]: E0316 00:26:50.434947 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound 
desc = could not find container \"262c685bc20e1d8192cb9a33c4a9c13cbf0d3368b0f0fd15e2dda89d5a3189fe\": container with ID starting with 262c685bc20e1d8192cb9a33c4a9c13cbf0d3368b0f0fd15e2dda89d5a3189fe not found: ID does not exist" containerID="262c685bc20e1d8192cb9a33c4a9c13cbf0d3368b0f0fd15e2dda89d5a3189fe" Mar 16 00:26:50 crc kubenswrapper[4816]: I0316 00:26:50.435000 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"262c685bc20e1d8192cb9a33c4a9c13cbf0d3368b0f0fd15e2dda89d5a3189fe"} err="failed to get container status \"262c685bc20e1d8192cb9a33c4a9c13cbf0d3368b0f0fd15e2dda89d5a3189fe\": rpc error: code = NotFound desc = could not find container \"262c685bc20e1d8192cb9a33c4a9c13cbf0d3368b0f0fd15e2dda89d5a3189fe\": container with ID starting with 262c685bc20e1d8192cb9a33c4a9c13cbf0d3368b0f0fd15e2dda89d5a3189fe not found: ID does not exist" Mar 16 00:26:50 crc kubenswrapper[4816]: I0316 00:26:50.608533 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c622ccbe-7da3-4233-905c-bd38932a01ff-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "c622ccbe-7da3-4233-905c-bd38932a01ff" (UID: "c622ccbe-7da3-4233-905c-bd38932a01ff"). InnerVolumeSpecName "container-storage-root". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:26:50 crc kubenswrapper[4816]: I0316 00:26:50.668922 4816 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/c622ccbe-7da3-4233-905c-bd38932a01ff-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 16 00:26:50 crc kubenswrapper[4816]: I0316 00:26:50.711786 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/sg-core-1-build"] Mar 16 00:26:50 crc kubenswrapper[4816]: I0316 00:26:50.719785 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/sg-core-1-build"] Mar 16 00:26:51 crc kubenswrapper[4816]: I0316 00:26:51.330734 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/sg-core-2-build"] Mar 16 00:26:51 crc kubenswrapper[4816]: E0316 00:26:51.330992 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c622ccbe-7da3-4233-905c-bd38932a01ff" containerName="docker-build" Mar 16 00:26:51 crc kubenswrapper[4816]: I0316 00:26:51.331006 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="c622ccbe-7da3-4233-905c-bd38932a01ff" containerName="docker-build" Mar 16 00:26:51 crc kubenswrapper[4816]: E0316 00:26:51.331022 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c622ccbe-7da3-4233-905c-bd38932a01ff" containerName="manage-dockerfile" Mar 16 00:26:51 crc kubenswrapper[4816]: I0316 00:26:51.331031 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="c622ccbe-7da3-4233-905c-bd38932a01ff" containerName="manage-dockerfile" Mar 16 00:26:51 crc kubenswrapper[4816]: I0316 00:26:51.331168 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="c622ccbe-7da3-4233-905c-bd38932a01ff" containerName="docker-build" Mar 16 00:26:51 crc kubenswrapper[4816]: I0316 00:26:51.332078 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-core-2-build" Mar 16 00:26:51 crc kubenswrapper[4816]: I0316 00:26:51.334430 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-core-2-global-ca" Mar 16 00:26:51 crc kubenswrapper[4816]: I0316 00:26:51.334638 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-core-2-sys-config" Mar 16 00:26:51 crc kubenswrapper[4816]: I0316 00:26:51.334800 4816 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-fs5z5" Mar 16 00:26:51 crc kubenswrapper[4816]: I0316 00:26:51.334996 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-core-2-ca" Mar 16 00:26:51 crc kubenswrapper[4816]: I0316 00:26:51.358198 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-core-2-build"] Mar 16 00:26:51 crc kubenswrapper[4816]: I0316 00:26:51.376799 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f1394889-b25e-4a90-ad3b-651e20e8ad20-build-ca-bundles\") pod \"sg-core-2-build\" (UID: \"f1394889-b25e-4a90-ad3b-651e20e8ad20\") " pod="service-telemetry/sg-core-2-build" Mar 16 00:26:51 crc kubenswrapper[4816]: I0316 00:26:51.376866 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/f1394889-b25e-4a90-ad3b-651e20e8ad20-build-system-configs\") pod \"sg-core-2-build\" (UID: \"f1394889-b25e-4a90-ad3b-651e20e8ad20\") " pod="service-telemetry/sg-core-2-build" Mar 16 00:26:51 crc kubenswrapper[4816]: I0316 00:26:51.376995 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: 
\"kubernetes.io/empty-dir/f1394889-b25e-4a90-ad3b-651e20e8ad20-container-storage-root\") pod \"sg-core-2-build\" (UID: \"f1394889-b25e-4a90-ad3b-651e20e8ad20\") " pod="service-telemetry/sg-core-2-build" Mar 16 00:26:51 crc kubenswrapper[4816]: I0316 00:26:51.377052 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/f1394889-b25e-4a90-ad3b-651e20e8ad20-build-blob-cache\") pod \"sg-core-2-build\" (UID: \"f1394889-b25e-4a90-ad3b-651e20e8ad20\") " pod="service-telemetry/sg-core-2-build" Mar 16 00:26:51 crc kubenswrapper[4816]: I0316 00:26:51.377121 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/f1394889-b25e-4a90-ad3b-651e20e8ad20-buildworkdir\") pod \"sg-core-2-build\" (UID: \"f1394889-b25e-4a90-ad3b-651e20e8ad20\") " pod="service-telemetry/sg-core-2-build" Mar 16 00:26:51 crc kubenswrapper[4816]: I0316 00:26:51.377152 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-fs5z5-push\" (UniqueName: \"kubernetes.io/secret/f1394889-b25e-4a90-ad3b-651e20e8ad20-builder-dockercfg-fs5z5-push\") pod \"sg-core-2-build\" (UID: \"f1394889-b25e-4a90-ad3b-651e20e8ad20\") " pod="service-telemetry/sg-core-2-build" Mar 16 00:26:51 crc kubenswrapper[4816]: I0316 00:26:51.377212 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f1394889-b25e-4a90-ad3b-651e20e8ad20-build-proxy-ca-bundles\") pod \"sg-core-2-build\" (UID: \"f1394889-b25e-4a90-ad3b-651e20e8ad20\") " pod="service-telemetry/sg-core-2-build" Mar 16 00:26:51 crc kubenswrapper[4816]: I0316 00:26:51.377258 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: 
\"kubernetes.io/host-path/f1394889-b25e-4a90-ad3b-651e20e8ad20-buildcachedir\") pod \"sg-core-2-build\" (UID: \"f1394889-b25e-4a90-ad3b-651e20e8ad20\") " pod="service-telemetry/sg-core-2-build" Mar 16 00:26:51 crc kubenswrapper[4816]: I0316 00:26:51.377277 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r6hxr\" (UniqueName: \"kubernetes.io/projected/f1394889-b25e-4a90-ad3b-651e20e8ad20-kube-api-access-r6hxr\") pod \"sg-core-2-build\" (UID: \"f1394889-b25e-4a90-ad3b-651e20e8ad20\") " pod="service-telemetry/sg-core-2-build" Mar 16 00:26:51 crc kubenswrapper[4816]: I0316 00:26:51.377317 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-fs5z5-pull\" (UniqueName: \"kubernetes.io/secret/f1394889-b25e-4a90-ad3b-651e20e8ad20-builder-dockercfg-fs5z5-pull\") pod \"sg-core-2-build\" (UID: \"f1394889-b25e-4a90-ad3b-651e20e8ad20\") " pod="service-telemetry/sg-core-2-build" Mar 16 00:26:51 crc kubenswrapper[4816]: I0316 00:26:51.377354 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/f1394889-b25e-4a90-ad3b-651e20e8ad20-container-storage-run\") pod \"sg-core-2-build\" (UID: \"f1394889-b25e-4a90-ad3b-651e20e8ad20\") " pod="service-telemetry/sg-core-2-build" Mar 16 00:26:51 crc kubenswrapper[4816]: I0316 00:26:51.377382 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/f1394889-b25e-4a90-ad3b-651e20e8ad20-node-pullsecrets\") pod \"sg-core-2-build\" (UID: \"f1394889-b25e-4a90-ad3b-651e20e8ad20\") " pod="service-telemetry/sg-core-2-build" Mar 16 00:26:51 crc kubenswrapper[4816]: I0316 00:26:51.477771 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/f1394889-b25e-4a90-ad3b-651e20e8ad20-build-ca-bundles\") pod \"sg-core-2-build\" (UID: \"f1394889-b25e-4a90-ad3b-651e20e8ad20\") " pod="service-telemetry/sg-core-2-build" Mar 16 00:26:51 crc kubenswrapper[4816]: I0316 00:26:51.477825 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/f1394889-b25e-4a90-ad3b-651e20e8ad20-build-system-configs\") pod \"sg-core-2-build\" (UID: \"f1394889-b25e-4a90-ad3b-651e20e8ad20\") " pod="service-telemetry/sg-core-2-build" Mar 16 00:26:51 crc kubenswrapper[4816]: I0316 00:26:51.477856 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/f1394889-b25e-4a90-ad3b-651e20e8ad20-container-storage-root\") pod \"sg-core-2-build\" (UID: \"f1394889-b25e-4a90-ad3b-651e20e8ad20\") " pod="service-telemetry/sg-core-2-build" Mar 16 00:26:51 crc kubenswrapper[4816]: I0316 00:26:51.477873 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/f1394889-b25e-4a90-ad3b-651e20e8ad20-build-blob-cache\") pod \"sg-core-2-build\" (UID: \"f1394889-b25e-4a90-ad3b-651e20e8ad20\") " pod="service-telemetry/sg-core-2-build" Mar 16 00:26:51 crc kubenswrapper[4816]: I0316 00:26:51.477893 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/f1394889-b25e-4a90-ad3b-651e20e8ad20-buildworkdir\") pod \"sg-core-2-build\" (UID: \"f1394889-b25e-4a90-ad3b-651e20e8ad20\") " pod="service-telemetry/sg-core-2-build" Mar 16 00:26:51 crc kubenswrapper[4816]: I0316 00:26:51.477912 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-fs5z5-push\" (UniqueName: \"kubernetes.io/secret/f1394889-b25e-4a90-ad3b-651e20e8ad20-builder-dockercfg-fs5z5-push\") pod 
\"sg-core-2-build\" (UID: \"f1394889-b25e-4a90-ad3b-651e20e8ad20\") " pod="service-telemetry/sg-core-2-build" Mar 16 00:26:51 crc kubenswrapper[4816]: I0316 00:26:51.477933 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f1394889-b25e-4a90-ad3b-651e20e8ad20-build-proxy-ca-bundles\") pod \"sg-core-2-build\" (UID: \"f1394889-b25e-4a90-ad3b-651e20e8ad20\") " pod="service-telemetry/sg-core-2-build" Mar 16 00:26:51 crc kubenswrapper[4816]: I0316 00:26:51.477954 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/f1394889-b25e-4a90-ad3b-651e20e8ad20-buildcachedir\") pod \"sg-core-2-build\" (UID: \"f1394889-b25e-4a90-ad3b-651e20e8ad20\") " pod="service-telemetry/sg-core-2-build" Mar 16 00:26:51 crc kubenswrapper[4816]: I0316 00:26:51.477971 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r6hxr\" (UniqueName: \"kubernetes.io/projected/f1394889-b25e-4a90-ad3b-651e20e8ad20-kube-api-access-r6hxr\") pod \"sg-core-2-build\" (UID: \"f1394889-b25e-4a90-ad3b-651e20e8ad20\") " pod="service-telemetry/sg-core-2-build" Mar 16 00:26:51 crc kubenswrapper[4816]: I0316 00:26:51.477989 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-fs5z5-pull\" (UniqueName: \"kubernetes.io/secret/f1394889-b25e-4a90-ad3b-651e20e8ad20-builder-dockercfg-fs5z5-pull\") pod \"sg-core-2-build\" (UID: \"f1394889-b25e-4a90-ad3b-651e20e8ad20\") " pod="service-telemetry/sg-core-2-build" Mar 16 00:26:51 crc kubenswrapper[4816]: I0316 00:26:51.478008 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/f1394889-b25e-4a90-ad3b-651e20e8ad20-container-storage-run\") pod \"sg-core-2-build\" (UID: \"f1394889-b25e-4a90-ad3b-651e20e8ad20\") " 
pod="service-telemetry/sg-core-2-build" Mar 16 00:26:51 crc kubenswrapper[4816]: I0316 00:26:51.478027 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/f1394889-b25e-4a90-ad3b-651e20e8ad20-node-pullsecrets\") pod \"sg-core-2-build\" (UID: \"f1394889-b25e-4a90-ad3b-651e20e8ad20\") " pod="service-telemetry/sg-core-2-build" Mar 16 00:26:51 crc kubenswrapper[4816]: I0316 00:26:51.478100 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/f1394889-b25e-4a90-ad3b-651e20e8ad20-node-pullsecrets\") pod \"sg-core-2-build\" (UID: \"f1394889-b25e-4a90-ad3b-651e20e8ad20\") " pod="service-telemetry/sg-core-2-build" Mar 16 00:26:51 crc kubenswrapper[4816]: I0316 00:26:51.478324 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/f1394889-b25e-4a90-ad3b-651e20e8ad20-container-storage-root\") pod \"sg-core-2-build\" (UID: \"f1394889-b25e-4a90-ad3b-651e20e8ad20\") " pod="service-telemetry/sg-core-2-build" Mar 16 00:26:51 crc kubenswrapper[4816]: I0316 00:26:51.478353 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/f1394889-b25e-4a90-ad3b-651e20e8ad20-buildworkdir\") pod \"sg-core-2-build\" (UID: \"f1394889-b25e-4a90-ad3b-651e20e8ad20\") " pod="service-telemetry/sg-core-2-build" Mar 16 00:26:51 crc kubenswrapper[4816]: I0316 00:26:51.478716 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/f1394889-b25e-4a90-ad3b-651e20e8ad20-build-blob-cache\") pod \"sg-core-2-build\" (UID: \"f1394889-b25e-4a90-ad3b-651e20e8ad20\") " pod="service-telemetry/sg-core-2-build" Mar 16 00:26:51 crc kubenswrapper[4816]: I0316 00:26:51.478827 4816 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/f1394889-b25e-4a90-ad3b-651e20e8ad20-buildcachedir\") pod \"sg-core-2-build\" (UID: \"f1394889-b25e-4a90-ad3b-651e20e8ad20\") " pod="service-telemetry/sg-core-2-build" Mar 16 00:26:51 crc kubenswrapper[4816]: I0316 00:26:51.479166 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f1394889-b25e-4a90-ad3b-651e20e8ad20-build-proxy-ca-bundles\") pod \"sg-core-2-build\" (UID: \"f1394889-b25e-4a90-ad3b-651e20e8ad20\") " pod="service-telemetry/sg-core-2-build" Mar 16 00:26:51 crc kubenswrapper[4816]: I0316 00:26:51.479173 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/f1394889-b25e-4a90-ad3b-651e20e8ad20-container-storage-run\") pod \"sg-core-2-build\" (UID: \"f1394889-b25e-4a90-ad3b-651e20e8ad20\") " pod="service-telemetry/sg-core-2-build" Mar 16 00:26:51 crc kubenswrapper[4816]: I0316 00:26:51.479642 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f1394889-b25e-4a90-ad3b-651e20e8ad20-build-ca-bundles\") pod \"sg-core-2-build\" (UID: \"f1394889-b25e-4a90-ad3b-651e20e8ad20\") " pod="service-telemetry/sg-core-2-build" Mar 16 00:26:51 crc kubenswrapper[4816]: I0316 00:26:51.479897 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/f1394889-b25e-4a90-ad3b-651e20e8ad20-build-system-configs\") pod \"sg-core-2-build\" (UID: \"f1394889-b25e-4a90-ad3b-651e20e8ad20\") " pod="service-telemetry/sg-core-2-build" Mar 16 00:26:51 crc kubenswrapper[4816]: I0316 00:26:51.482222 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-fs5z5-pull\" (UniqueName: 
\"kubernetes.io/secret/f1394889-b25e-4a90-ad3b-651e20e8ad20-builder-dockercfg-fs5z5-pull\") pod \"sg-core-2-build\" (UID: \"f1394889-b25e-4a90-ad3b-651e20e8ad20\") " pod="service-telemetry/sg-core-2-build" Mar 16 00:26:51 crc kubenswrapper[4816]: I0316 00:26:51.482236 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-fs5z5-push\" (UniqueName: \"kubernetes.io/secret/f1394889-b25e-4a90-ad3b-651e20e8ad20-builder-dockercfg-fs5z5-push\") pod \"sg-core-2-build\" (UID: \"f1394889-b25e-4a90-ad3b-651e20e8ad20\") " pod="service-telemetry/sg-core-2-build" Mar 16 00:26:51 crc kubenswrapper[4816]: I0316 00:26:51.494924 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r6hxr\" (UniqueName: \"kubernetes.io/projected/f1394889-b25e-4a90-ad3b-651e20e8ad20-kube-api-access-r6hxr\") pod \"sg-core-2-build\" (UID: \"f1394889-b25e-4a90-ad3b-651e20e8ad20\") " pod="service-telemetry/sg-core-2-build" Mar 16 00:26:51 crc kubenswrapper[4816]: I0316 00:26:51.665611 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-core-2-build" Mar 16 00:26:51 crc kubenswrapper[4816]: I0316 00:26:51.675012 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c622ccbe-7da3-4233-905c-bd38932a01ff" path="/var/lib/kubelet/pods/c622ccbe-7da3-4233-905c-bd38932a01ff/volumes" Mar 16 00:26:51 crc kubenswrapper[4816]: I0316 00:26:51.851154 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-core-2-build"] Mar 16 00:26:52 crc kubenswrapper[4816]: I0316 00:26:52.387971 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-2-build" event={"ID":"f1394889-b25e-4a90-ad3b-651e20e8ad20","Type":"ContainerStarted","Data":"3cc439cbecc1d9ee2a5bd6390fe4f76ea33e05cf5c2ab078d9693997d08ecf9a"} Mar 16 00:26:52 crc kubenswrapper[4816]: I0316 00:26:52.388024 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-2-build" event={"ID":"f1394889-b25e-4a90-ad3b-651e20e8ad20","Type":"ContainerStarted","Data":"ae3c6d8445adc1875e9fd69aeb8761204c220a359e0c05ee64563ec952a146ae"} Mar 16 00:26:53 crc kubenswrapper[4816]: I0316 00:26:53.396096 4816 generic.go:334] "Generic (PLEG): container finished" podID="f1394889-b25e-4a90-ad3b-651e20e8ad20" containerID="3cc439cbecc1d9ee2a5bd6390fe4f76ea33e05cf5c2ab078d9693997d08ecf9a" exitCode=0 Mar 16 00:26:53 crc kubenswrapper[4816]: I0316 00:26:53.396144 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-2-build" event={"ID":"f1394889-b25e-4a90-ad3b-651e20e8ad20","Type":"ContainerDied","Data":"3cc439cbecc1d9ee2a5bd6390fe4f76ea33e05cf5c2ab078d9693997d08ecf9a"} Mar 16 00:26:54 crc kubenswrapper[4816]: I0316 00:26:54.405873 4816 generic.go:334] "Generic (PLEG): container finished" podID="f1394889-b25e-4a90-ad3b-651e20e8ad20" containerID="55a81fa5148d34c2655ea1ac667509031445a92bd58b8398676af8c464844e7c" exitCode=0 Mar 16 00:26:54 crc kubenswrapper[4816]: I0316 00:26:54.405921 4816 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-2-build" event={"ID":"f1394889-b25e-4a90-ad3b-651e20e8ad20","Type":"ContainerDied","Data":"55a81fa5148d34c2655ea1ac667509031445a92bd58b8398676af8c464844e7c"} Mar 16 00:26:54 crc kubenswrapper[4816]: I0316 00:26:54.449245 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_sg-core-2-build_f1394889-b25e-4a90-ad3b-651e20e8ad20/manage-dockerfile/0.log" Mar 16 00:26:55 crc kubenswrapper[4816]: I0316 00:26:55.418829 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-2-build" event={"ID":"f1394889-b25e-4a90-ad3b-651e20e8ad20","Type":"ContainerStarted","Data":"9f92516e42ba33b1d2e8579fa9c2dd369bfb873ad948e6fca41db7e816622c1e"} Mar 16 00:26:55 crc kubenswrapper[4816]: I0316 00:26:55.464216 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/sg-core-2-build" podStartSLOduration=4.464198626 podStartE2EDuration="4.464198626s" podCreationTimestamp="2026-03-16 00:26:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:26:55.459140565 +0000 UTC m=+1208.555440538" watchObservedRunningTime="2026-03-16 00:26:55.464198626 +0000 UTC m=+1208.560498589" Mar 16 00:27:04 crc kubenswrapper[4816]: I0316 00:27:04.144316 4816 scope.go:117] "RemoveContainer" containerID="4565949d11f1fa384d67b3420395f0c07c9d2ee22190f1a94b2e1bc9e4c10a96" Mar 16 00:27:21 crc kubenswrapper[4816]: E0316 00:27:21.987691 4816 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf1394889_b25e_4a90_ad3b_651e20e8ad20.slice/buildah-buildah3198239443\": RecentStats: unable to find data in memory cache]" Mar 16 00:28:00 crc kubenswrapper[4816]: I0316 00:28:00.133776 4816 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-infra/auto-csr-approver-29560348-xvv6w"] Mar 16 00:28:00 crc kubenswrapper[4816]: I0316 00:28:00.135292 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560348-xvv6w" Mar 16 00:28:00 crc kubenswrapper[4816]: I0316 00:28:00.137457 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-8hc2r" Mar 16 00:28:00 crc kubenswrapper[4816]: I0316 00:28:00.138078 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 16 00:28:00 crc kubenswrapper[4816]: I0316 00:28:00.140763 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 16 00:28:00 crc kubenswrapper[4816]: I0316 00:28:00.141582 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29560348-xvv6w"] Mar 16 00:28:00 crc kubenswrapper[4816]: I0316 00:28:00.190564 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5fg2\" (UniqueName: \"kubernetes.io/projected/a529fd1f-66e5-4e49-b95a-18c6a8aade4b-kube-api-access-s5fg2\") pod \"auto-csr-approver-29560348-xvv6w\" (UID: \"a529fd1f-66e5-4e49-b95a-18c6a8aade4b\") " pod="openshift-infra/auto-csr-approver-29560348-xvv6w" Mar 16 00:28:00 crc kubenswrapper[4816]: I0316 00:28:00.291214 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s5fg2\" (UniqueName: \"kubernetes.io/projected/a529fd1f-66e5-4e49-b95a-18c6a8aade4b-kube-api-access-s5fg2\") pod \"auto-csr-approver-29560348-xvv6w\" (UID: \"a529fd1f-66e5-4e49-b95a-18c6a8aade4b\") " pod="openshift-infra/auto-csr-approver-29560348-xvv6w" Mar 16 00:28:00 crc kubenswrapper[4816]: I0316 00:28:00.314098 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5fg2\" (UniqueName: 
\"kubernetes.io/projected/a529fd1f-66e5-4e49-b95a-18c6a8aade4b-kube-api-access-s5fg2\") pod \"auto-csr-approver-29560348-xvv6w\" (UID: \"a529fd1f-66e5-4e49-b95a-18c6a8aade4b\") " pod="openshift-infra/auto-csr-approver-29560348-xvv6w" Mar 16 00:28:00 crc kubenswrapper[4816]: I0316 00:28:00.450937 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560348-xvv6w" Mar 16 00:28:00 crc kubenswrapper[4816]: I0316 00:28:00.681310 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29560348-xvv6w"] Mar 16 00:28:00 crc kubenswrapper[4816]: I0316 00:28:00.687805 4816 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 16 00:28:00 crc kubenswrapper[4816]: I0316 00:28:00.853811 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560348-xvv6w" event={"ID":"a529fd1f-66e5-4e49-b95a-18c6a8aade4b","Type":"ContainerStarted","Data":"c0597947f63d22a5a4d56c7617834f9f25d8bc5ca2d1f02506b343df4dd98c86"} Mar 16 00:28:02 crc kubenswrapper[4816]: I0316 00:28:02.866999 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560348-xvv6w" event={"ID":"a529fd1f-66e5-4e49-b95a-18c6a8aade4b","Type":"ContainerStarted","Data":"2184a6c7d5ea889f0c49670caabfc30e2cdb52bf2b9beb7864557d83b84bbb54"} Mar 16 00:28:02 crc kubenswrapper[4816]: I0316 00:28:02.881252 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29560348-xvv6w" podStartSLOduration=1.099234435 podStartE2EDuration="2.881237333s" podCreationTimestamp="2026-03-16 00:28:00 +0000 UTC" firstStartedPulling="2026-03-16 00:28:00.687616033 +0000 UTC m=+1273.783915986" lastFinishedPulling="2026-03-16 00:28:02.469618931 +0000 UTC m=+1275.565918884" observedRunningTime="2026-03-16 00:28:02.877239071 +0000 UTC m=+1275.973539024" 
watchObservedRunningTime="2026-03-16 00:28:02.881237333 +0000 UTC m=+1275.977537286" Mar 16 00:28:03 crc kubenswrapper[4816]: I0316 00:28:03.873303 4816 generic.go:334] "Generic (PLEG): container finished" podID="a529fd1f-66e5-4e49-b95a-18c6a8aade4b" containerID="2184a6c7d5ea889f0c49670caabfc30e2cdb52bf2b9beb7864557d83b84bbb54" exitCode=0 Mar 16 00:28:03 crc kubenswrapper[4816]: I0316 00:28:03.873352 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560348-xvv6w" event={"ID":"a529fd1f-66e5-4e49-b95a-18c6a8aade4b","Type":"ContainerDied","Data":"2184a6c7d5ea889f0c49670caabfc30e2cdb52bf2b9beb7864557d83b84bbb54"} Mar 16 00:28:05 crc kubenswrapper[4816]: I0316 00:28:05.087824 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560348-xvv6w" Mar 16 00:28:05 crc kubenswrapper[4816]: I0316 00:28:05.163832 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s5fg2\" (UniqueName: \"kubernetes.io/projected/a529fd1f-66e5-4e49-b95a-18c6a8aade4b-kube-api-access-s5fg2\") pod \"a529fd1f-66e5-4e49-b95a-18c6a8aade4b\" (UID: \"a529fd1f-66e5-4e49-b95a-18c6a8aade4b\") " Mar 16 00:28:05 crc kubenswrapper[4816]: I0316 00:28:05.169300 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a529fd1f-66e5-4e49-b95a-18c6a8aade4b-kube-api-access-s5fg2" (OuterVolumeSpecName: "kube-api-access-s5fg2") pod "a529fd1f-66e5-4e49-b95a-18c6a8aade4b" (UID: "a529fd1f-66e5-4e49-b95a-18c6a8aade4b"). InnerVolumeSpecName "kube-api-access-s5fg2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:28:05 crc kubenswrapper[4816]: I0316 00:28:05.265677 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s5fg2\" (UniqueName: \"kubernetes.io/projected/a529fd1f-66e5-4e49-b95a-18c6a8aade4b-kube-api-access-s5fg2\") on node \"crc\" DevicePath \"\"" Mar 16 00:28:05 crc kubenswrapper[4816]: I0316 00:28:05.887171 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560348-xvv6w" event={"ID":"a529fd1f-66e5-4e49-b95a-18c6a8aade4b","Type":"ContainerDied","Data":"c0597947f63d22a5a4d56c7617834f9f25d8bc5ca2d1f02506b343df4dd98c86"} Mar 16 00:28:05 crc kubenswrapper[4816]: I0316 00:28:05.887203 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c0597947f63d22a5a4d56c7617834f9f25d8bc5ca2d1f02506b343df4dd98c86" Mar 16 00:28:05 crc kubenswrapper[4816]: I0316 00:28:05.887208 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560348-xvv6w" Mar 16 00:28:05 crc kubenswrapper[4816]: I0316 00:28:05.937838 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29560342-qq7qg"] Mar 16 00:28:05 crc kubenswrapper[4816]: I0316 00:28:05.943741 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29560342-qq7qg"] Mar 16 00:28:07 crc kubenswrapper[4816]: I0316 00:28:07.680910 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d60f1a00-e9c6-46ff-b5eb-f3c680f04736" path="/var/lib/kubelet/pods/d60f1a00-e9c6-46ff-b5eb-f3c680f04736/volumes" Mar 16 00:28:31 crc kubenswrapper[4816]: I0316 00:28:31.863776 4816 patch_prober.go:28] interesting pod/machine-config-daemon-jrdcz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 16 00:28:31 crc kubenswrapper[4816]: I0316 00:28:31.864256 4816 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" podUID="dd08ece2-7636-4966-973a-e96a34b70b53" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 16 00:29:01 crc kubenswrapper[4816]: I0316 00:29:01.863479 4816 patch_prober.go:28] interesting pod/machine-config-daemon-jrdcz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 16 00:29:01 crc kubenswrapper[4816]: I0316 00:29:01.864165 4816 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" podUID="dd08ece2-7636-4966-973a-e96a34b70b53" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 16 00:29:04 crc kubenswrapper[4816]: I0316 00:29:04.230406 4816 scope.go:117] "RemoveContainer" containerID="dbd7c0bfa602e132787d7d6d843e255ebdb6acf34354466437ff4e5db80a17a7" Mar 16 00:29:31 crc kubenswrapper[4816]: I0316 00:29:31.863574 4816 patch_prober.go:28] interesting pod/machine-config-daemon-jrdcz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 16 00:29:31 crc kubenswrapper[4816]: I0316 00:29:31.865010 4816 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" podUID="dd08ece2-7636-4966-973a-e96a34b70b53" containerName="machine-config-daemon" 
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 16 00:29:31 crc kubenswrapper[4816]: I0316 00:29:31.865117 4816 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" Mar 16 00:29:31 crc kubenswrapper[4816]: I0316 00:29:31.865778 4816 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4dae7771bcc5c45d3db6bc1014246492c003743ca85668bad7e04528051cc6bc"} pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 16 00:29:31 crc kubenswrapper[4816]: I0316 00:29:31.865924 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" podUID="dd08ece2-7636-4966-973a-e96a34b70b53" containerName="machine-config-daemon" containerID="cri-o://4dae7771bcc5c45d3db6bc1014246492c003743ca85668bad7e04528051cc6bc" gracePeriod=600 Mar 16 00:29:32 crc kubenswrapper[4816]: I0316 00:29:32.499073 4816 generic.go:334] "Generic (PLEG): container finished" podID="dd08ece2-7636-4966-973a-e96a34b70b53" containerID="4dae7771bcc5c45d3db6bc1014246492c003743ca85668bad7e04528051cc6bc" exitCode=0 Mar 16 00:29:32 crc kubenswrapper[4816]: I0316 00:29:32.499114 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" event={"ID":"dd08ece2-7636-4966-973a-e96a34b70b53","Type":"ContainerDied","Data":"4dae7771bcc5c45d3db6bc1014246492c003743ca85668bad7e04528051cc6bc"} Mar 16 00:29:32 crc kubenswrapper[4816]: I0316 00:29:32.499147 4816 scope.go:117] "RemoveContainer" containerID="d963d56deb174bcc1b2f530e646e1a1dbd328868a82631422f67c019c313cf52" Mar 16 00:29:33 crc kubenswrapper[4816]: I0316 00:29:33.508180 4816 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" event={"ID":"dd08ece2-7636-4966-973a-e96a34b70b53","Type":"ContainerStarted","Data":"92fd160da980a35a692640a98195800839c4f80b2447586e89c2230217ad0071"} Mar 16 00:29:59 crc kubenswrapper[4816]: I0316 00:29:59.712814 4816 generic.go:334] "Generic (PLEG): container finished" podID="f1394889-b25e-4a90-ad3b-651e20e8ad20" containerID="9f92516e42ba33b1d2e8579fa9c2dd369bfb873ad948e6fca41db7e816622c1e" exitCode=0 Mar 16 00:29:59 crc kubenswrapper[4816]: I0316 00:29:59.712902 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-2-build" event={"ID":"f1394889-b25e-4a90-ad3b-651e20e8ad20","Type":"ContainerDied","Data":"9f92516e42ba33b1d2e8579fa9c2dd369bfb873ad948e6fca41db7e816622c1e"} Mar 16 00:30:00 crc kubenswrapper[4816]: I0316 00:30:00.151345 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29560350-6dpp5"] Mar 16 00:30:00 crc kubenswrapper[4816]: E0316 00:30:00.151924 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a529fd1f-66e5-4e49-b95a-18c6a8aade4b" containerName="oc" Mar 16 00:30:00 crc kubenswrapper[4816]: I0316 00:30:00.151940 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="a529fd1f-66e5-4e49-b95a-18c6a8aade4b" containerName="oc" Mar 16 00:30:00 crc kubenswrapper[4816]: I0316 00:30:00.157088 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="a529fd1f-66e5-4e49-b95a-18c6a8aade4b" containerName="oc" Mar 16 00:30:00 crc kubenswrapper[4816]: I0316 00:30:00.158351 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29560350-6dpp5" Mar 16 00:30:00 crc kubenswrapper[4816]: I0316 00:30:00.165925 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 16 00:30:00 crc kubenswrapper[4816]: I0316 00:30:00.166251 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 16 00:30:00 crc kubenswrapper[4816]: I0316 00:30:00.166585 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-8hc2r" Mar 16 00:30:00 crc kubenswrapper[4816]: I0316 00:30:00.169232 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29560350-pcslp"] Mar 16 00:30:00 crc kubenswrapper[4816]: I0316 00:30:00.170475 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29560350-pcslp" Mar 16 00:30:00 crc kubenswrapper[4816]: I0316 00:30:00.173213 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 16 00:30:00 crc kubenswrapper[4816]: I0316 00:30:00.173343 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 16 00:30:00 crc kubenswrapper[4816]: I0316 00:30:00.177732 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29560350-6dpp5"] Mar 16 00:30:00 crc kubenswrapper[4816]: I0316 00:30:00.183662 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29560350-pcslp"] Mar 16 00:30:00 crc kubenswrapper[4816]: I0316 00:30:00.247290 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6zlg\" (UniqueName: 
\"kubernetes.io/projected/12bfc435-89c2-4917-9bb6-cc2e9eca440c-kube-api-access-c6zlg\") pod \"auto-csr-approver-29560350-6dpp5\" (UID: \"12bfc435-89c2-4917-9bb6-cc2e9eca440c\") " pod="openshift-infra/auto-csr-approver-29560350-6dpp5" Mar 16 00:30:00 crc kubenswrapper[4816]: I0316 00:30:00.247379 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-64s4v\" (UniqueName: \"kubernetes.io/projected/c27926cb-7a0c-4dff-a823-0c9cfdb9977c-kube-api-access-64s4v\") pod \"collect-profiles-29560350-pcslp\" (UID: \"c27926cb-7a0c-4dff-a823-0c9cfdb9977c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29560350-pcslp" Mar 16 00:30:00 crc kubenswrapper[4816]: I0316 00:30:00.247442 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c27926cb-7a0c-4dff-a823-0c9cfdb9977c-secret-volume\") pod \"collect-profiles-29560350-pcslp\" (UID: \"c27926cb-7a0c-4dff-a823-0c9cfdb9977c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29560350-pcslp" Mar 16 00:30:00 crc kubenswrapper[4816]: I0316 00:30:00.247462 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c27926cb-7a0c-4dff-a823-0c9cfdb9977c-config-volume\") pod \"collect-profiles-29560350-pcslp\" (UID: \"c27926cb-7a0c-4dff-a823-0c9cfdb9977c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29560350-pcslp" Mar 16 00:30:00 crc kubenswrapper[4816]: I0316 00:30:00.349392 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c6zlg\" (UniqueName: \"kubernetes.io/projected/12bfc435-89c2-4917-9bb6-cc2e9eca440c-kube-api-access-c6zlg\") pod \"auto-csr-approver-29560350-6dpp5\" (UID: \"12bfc435-89c2-4917-9bb6-cc2e9eca440c\") " pod="openshift-infra/auto-csr-approver-29560350-6dpp5" Mar 16 00:30:00 
crc kubenswrapper[4816]: I0316 00:30:00.349488 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-64s4v\" (UniqueName: \"kubernetes.io/projected/c27926cb-7a0c-4dff-a823-0c9cfdb9977c-kube-api-access-64s4v\") pod \"collect-profiles-29560350-pcslp\" (UID: \"c27926cb-7a0c-4dff-a823-0c9cfdb9977c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29560350-pcslp" Mar 16 00:30:00 crc kubenswrapper[4816]: I0316 00:30:00.349653 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c27926cb-7a0c-4dff-a823-0c9cfdb9977c-secret-volume\") pod \"collect-profiles-29560350-pcslp\" (UID: \"c27926cb-7a0c-4dff-a823-0c9cfdb9977c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29560350-pcslp" Mar 16 00:30:00 crc kubenswrapper[4816]: I0316 00:30:00.349700 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c27926cb-7a0c-4dff-a823-0c9cfdb9977c-config-volume\") pod \"collect-profiles-29560350-pcslp\" (UID: \"c27926cb-7a0c-4dff-a823-0c9cfdb9977c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29560350-pcslp" Mar 16 00:30:00 crc kubenswrapper[4816]: I0316 00:30:00.351855 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c27926cb-7a0c-4dff-a823-0c9cfdb9977c-config-volume\") pod \"collect-profiles-29560350-pcslp\" (UID: \"c27926cb-7a0c-4dff-a823-0c9cfdb9977c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29560350-pcslp" Mar 16 00:30:00 crc kubenswrapper[4816]: I0316 00:30:00.363542 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c27926cb-7a0c-4dff-a823-0c9cfdb9977c-secret-volume\") pod \"collect-profiles-29560350-pcslp\" (UID: 
\"c27926cb-7a0c-4dff-a823-0c9cfdb9977c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29560350-pcslp" Mar 16 00:30:00 crc kubenswrapper[4816]: I0316 00:30:00.379706 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-64s4v\" (UniqueName: \"kubernetes.io/projected/c27926cb-7a0c-4dff-a823-0c9cfdb9977c-kube-api-access-64s4v\") pod \"collect-profiles-29560350-pcslp\" (UID: \"c27926cb-7a0c-4dff-a823-0c9cfdb9977c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29560350-pcslp" Mar 16 00:30:00 crc kubenswrapper[4816]: I0316 00:30:00.380598 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6zlg\" (UniqueName: \"kubernetes.io/projected/12bfc435-89c2-4917-9bb6-cc2e9eca440c-kube-api-access-c6zlg\") pod \"auto-csr-approver-29560350-6dpp5\" (UID: \"12bfc435-89c2-4917-9bb6-cc2e9eca440c\") " pod="openshift-infra/auto-csr-approver-29560350-6dpp5" Mar 16 00:30:00 crc kubenswrapper[4816]: I0316 00:30:00.493745 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560350-6dpp5" Mar 16 00:30:00 crc kubenswrapper[4816]: I0316 00:30:00.507236 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29560350-pcslp" Mar 16 00:30:00 crc kubenswrapper[4816]: I0316 00:30:00.693812 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29560350-6dpp5"] Mar 16 00:30:00 crc kubenswrapper[4816]: I0316 00:30:00.721086 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560350-6dpp5" event={"ID":"12bfc435-89c2-4917-9bb6-cc2e9eca440c","Type":"ContainerStarted","Data":"49cea63e1c43a10078ab745d499a1cd66311bccc4b9367191a210448cc27ed33"} Mar 16 00:30:00 crc kubenswrapper[4816]: I0316 00:30:00.937680 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-core-2-build" Mar 16 00:30:00 crc kubenswrapper[4816]: I0316 00:30:00.945982 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29560350-pcslp"] Mar 16 00:30:00 crc kubenswrapper[4816]: W0316 00:30:00.950419 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc27926cb_7a0c_4dff_a823_0c9cfdb9977c.slice/crio-c8710f60fe7c9f4ba20c58866354cddd542d4c1c8693fb3b7c4d88443de40c27 WatchSource:0}: Error finding container c8710f60fe7c9f4ba20c58866354cddd542d4c1c8693fb3b7c4d88443de40c27: Status 404 returned error can't find the container with id c8710f60fe7c9f4ba20c58866354cddd542d4c1c8693fb3b7c4d88443de40c27 Mar 16 00:30:00 crc kubenswrapper[4816]: I0316 00:30:00.959714 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r6hxr\" (UniqueName: \"kubernetes.io/projected/f1394889-b25e-4a90-ad3b-651e20e8ad20-kube-api-access-r6hxr\") pod \"f1394889-b25e-4a90-ad3b-651e20e8ad20\" (UID: \"f1394889-b25e-4a90-ad3b-651e20e8ad20\") " Mar 16 00:30:00 crc kubenswrapper[4816]: I0316 00:30:00.959930 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/f1394889-b25e-4a90-ad3b-651e20e8ad20-build-system-configs\") pod \"f1394889-b25e-4a90-ad3b-651e20e8ad20\" (UID: \"f1394889-b25e-4a90-ad3b-651e20e8ad20\") " Mar 16 00:30:00 crc kubenswrapper[4816]: I0316 00:30:00.960071 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-fs5z5-push\" (UniqueName: \"kubernetes.io/secret/f1394889-b25e-4a90-ad3b-651e20e8ad20-builder-dockercfg-fs5z5-push\") pod \"f1394889-b25e-4a90-ad3b-651e20e8ad20\" (UID: \"f1394889-b25e-4a90-ad3b-651e20e8ad20\") " Mar 16 00:30:00 crc kubenswrapper[4816]: I0316 00:30:00.960134 4816 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f1394889-b25e-4a90-ad3b-651e20e8ad20-build-proxy-ca-bundles\") pod \"f1394889-b25e-4a90-ad3b-651e20e8ad20\" (UID: \"f1394889-b25e-4a90-ad3b-651e20e8ad20\") " Mar 16 00:30:00 crc kubenswrapper[4816]: I0316 00:30:00.960191 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/f1394889-b25e-4a90-ad3b-651e20e8ad20-buildcachedir\") pod \"f1394889-b25e-4a90-ad3b-651e20e8ad20\" (UID: \"f1394889-b25e-4a90-ad3b-651e20e8ad20\") " Mar 16 00:30:00 crc kubenswrapper[4816]: I0316 00:30:00.960304 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/f1394889-b25e-4a90-ad3b-651e20e8ad20-buildworkdir\") pod \"f1394889-b25e-4a90-ad3b-651e20e8ad20\" (UID: \"f1394889-b25e-4a90-ad3b-651e20e8ad20\") " Mar 16 00:30:00 crc kubenswrapper[4816]: I0316 00:30:00.960385 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f1394889-b25e-4a90-ad3b-651e20e8ad20-build-ca-bundles\") pod \"f1394889-b25e-4a90-ad3b-651e20e8ad20\" (UID: \"f1394889-b25e-4a90-ad3b-651e20e8ad20\") " Mar 16 00:30:00 crc kubenswrapper[4816]: I0316 00:30:00.960456 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-fs5z5-pull\" (UniqueName: \"kubernetes.io/secret/f1394889-b25e-4a90-ad3b-651e20e8ad20-builder-dockercfg-fs5z5-pull\") pod \"f1394889-b25e-4a90-ad3b-651e20e8ad20\" (UID: \"f1394889-b25e-4a90-ad3b-651e20e8ad20\") " Mar 16 00:30:00 crc kubenswrapper[4816]: I0316 00:30:00.960529 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/f1394889-b25e-4a90-ad3b-651e20e8ad20-build-blob-cache\") pod \"f1394889-b25e-4a90-ad3b-651e20e8ad20\" (UID: \"f1394889-b25e-4a90-ad3b-651e20e8ad20\") " Mar 16 00:30:00 crc kubenswrapper[4816]: I0316 00:30:00.960612 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1394889-b25e-4a90-ad3b-651e20e8ad20-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "f1394889-b25e-4a90-ad3b-651e20e8ad20" (UID: "f1394889-b25e-4a90-ad3b-651e20e8ad20"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:30:00 crc kubenswrapper[4816]: I0316 00:30:00.960624 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/f1394889-b25e-4a90-ad3b-651e20e8ad20-container-storage-root\") pod \"f1394889-b25e-4a90-ad3b-651e20e8ad20\" (UID: \"f1394889-b25e-4a90-ad3b-651e20e8ad20\") " Mar 16 00:30:00 crc kubenswrapper[4816]: I0316 00:30:00.960721 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/f1394889-b25e-4a90-ad3b-651e20e8ad20-container-storage-run\") pod \"f1394889-b25e-4a90-ad3b-651e20e8ad20\" (UID: \"f1394889-b25e-4a90-ad3b-651e20e8ad20\") " Mar 16 00:30:00 crc kubenswrapper[4816]: I0316 00:30:00.960758 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/f1394889-b25e-4a90-ad3b-651e20e8ad20-node-pullsecrets\") pod \"f1394889-b25e-4a90-ad3b-651e20e8ad20\" (UID: \"f1394889-b25e-4a90-ad3b-651e20e8ad20\") " Mar 16 00:30:00 crc kubenswrapper[4816]: I0316 00:30:00.961180 4816 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/f1394889-b25e-4a90-ad3b-651e20e8ad20-build-system-configs\") on node \"crc\" 
DevicePath \"\"" Mar 16 00:30:00 crc kubenswrapper[4816]: I0316 00:30:00.961215 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f1394889-b25e-4a90-ad3b-651e20e8ad20-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "f1394889-b25e-4a90-ad3b-651e20e8ad20" (UID: "f1394889-b25e-4a90-ad3b-651e20e8ad20"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:30:00 crc kubenswrapper[4816]: I0316 00:30:00.962835 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f1394889-b25e-4a90-ad3b-651e20e8ad20-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "f1394889-b25e-4a90-ad3b-651e20e8ad20" (UID: "f1394889-b25e-4a90-ad3b-651e20e8ad20"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:30:00 crc kubenswrapper[4816]: I0316 00:30:00.963707 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f1394889-b25e-4a90-ad3b-651e20e8ad20-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "f1394889-b25e-4a90-ad3b-651e20e8ad20" (UID: "f1394889-b25e-4a90-ad3b-651e20e8ad20"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:30:00 crc kubenswrapper[4816]: I0316 00:30:00.966893 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1394889-b25e-4a90-ad3b-651e20e8ad20-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "f1394889-b25e-4a90-ad3b-651e20e8ad20" (UID: "f1394889-b25e-4a90-ad3b-651e20e8ad20"). InnerVolumeSpecName "build-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:30:00 crc kubenswrapper[4816]: I0316 00:30:00.967757 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1394889-b25e-4a90-ad3b-651e20e8ad20-builder-dockercfg-fs5z5-push" (OuterVolumeSpecName: "builder-dockercfg-fs5z5-push") pod "f1394889-b25e-4a90-ad3b-651e20e8ad20" (UID: "f1394889-b25e-4a90-ad3b-651e20e8ad20"). InnerVolumeSpecName "builder-dockercfg-fs5z5-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:30:00 crc kubenswrapper[4816]: I0316 00:30:00.968185 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1394889-b25e-4a90-ad3b-651e20e8ad20-builder-dockercfg-fs5z5-pull" (OuterVolumeSpecName: "builder-dockercfg-fs5z5-pull") pod "f1394889-b25e-4a90-ad3b-651e20e8ad20" (UID: "f1394889-b25e-4a90-ad3b-651e20e8ad20"). InnerVolumeSpecName "builder-dockercfg-fs5z5-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:30:00 crc kubenswrapper[4816]: I0316 00:30:00.969685 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1394889-b25e-4a90-ad3b-651e20e8ad20-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "f1394889-b25e-4a90-ad3b-651e20e8ad20" (UID: "f1394889-b25e-4a90-ad3b-651e20e8ad20"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:30:00 crc kubenswrapper[4816]: I0316 00:30:00.972445 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1394889-b25e-4a90-ad3b-651e20e8ad20-kube-api-access-r6hxr" (OuterVolumeSpecName: "kube-api-access-r6hxr") pod "f1394889-b25e-4a90-ad3b-651e20e8ad20" (UID: "f1394889-b25e-4a90-ad3b-651e20e8ad20"). InnerVolumeSpecName "kube-api-access-r6hxr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:30:00 crc kubenswrapper[4816]: I0316 00:30:00.986880 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f1394889-b25e-4a90-ad3b-651e20e8ad20-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "f1394889-b25e-4a90-ad3b-651e20e8ad20" (UID: "f1394889-b25e-4a90-ad3b-651e20e8ad20"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:30:01 crc kubenswrapper[4816]: I0316 00:30:01.063352 4816 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-fs5z5-push\" (UniqueName: \"kubernetes.io/secret/f1394889-b25e-4a90-ad3b-651e20e8ad20-builder-dockercfg-fs5z5-push\") on node \"crc\" DevicePath \"\"" Mar 16 00:30:01 crc kubenswrapper[4816]: I0316 00:30:01.063393 4816 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f1394889-b25e-4a90-ad3b-651e20e8ad20-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 16 00:30:01 crc kubenswrapper[4816]: I0316 00:30:01.063405 4816 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/f1394889-b25e-4a90-ad3b-651e20e8ad20-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 16 00:30:01 crc kubenswrapper[4816]: I0316 00:30:01.063425 4816 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/f1394889-b25e-4a90-ad3b-651e20e8ad20-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 16 00:30:01 crc kubenswrapper[4816]: I0316 00:30:01.063434 4816 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f1394889-b25e-4a90-ad3b-651e20e8ad20-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 16 00:30:01 crc kubenswrapper[4816]: I0316 00:30:01.063442 4816 reconciler_common.go:293] "Volume detached for volume 
\"builder-dockercfg-fs5z5-pull\" (UniqueName: \"kubernetes.io/secret/f1394889-b25e-4a90-ad3b-651e20e8ad20-builder-dockercfg-fs5z5-pull\") on node \"crc\" DevicePath \"\"" Mar 16 00:30:01 crc kubenswrapper[4816]: I0316 00:30:01.063504 4816 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/f1394889-b25e-4a90-ad3b-651e20e8ad20-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 16 00:30:01 crc kubenswrapper[4816]: I0316 00:30:01.063515 4816 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/f1394889-b25e-4a90-ad3b-651e20e8ad20-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 16 00:30:01 crc kubenswrapper[4816]: I0316 00:30:01.063584 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r6hxr\" (UniqueName: \"kubernetes.io/projected/f1394889-b25e-4a90-ad3b-651e20e8ad20-kube-api-access-r6hxr\") on node \"crc\" DevicePath \"\"" Mar 16 00:30:01 crc kubenswrapper[4816]: I0316 00:30:01.276783 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f1394889-b25e-4a90-ad3b-651e20e8ad20-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "f1394889-b25e-4a90-ad3b-651e20e8ad20" (UID: "f1394889-b25e-4a90-ad3b-651e20e8ad20"). InnerVolumeSpecName "build-blob-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:30:01 crc kubenswrapper[4816]: I0316 00:30:01.367440 4816 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/f1394889-b25e-4a90-ad3b-651e20e8ad20-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 16 00:30:01 crc kubenswrapper[4816]: I0316 00:30:01.730661 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-2-build" event={"ID":"f1394889-b25e-4a90-ad3b-651e20e8ad20","Type":"ContainerDied","Data":"ae3c6d8445adc1875e9fd69aeb8761204c220a359e0c05ee64563ec952a146ae"} Mar 16 00:30:01 crc kubenswrapper[4816]: I0316 00:30:01.730716 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ae3c6d8445adc1875e9fd69aeb8761204c220a359e0c05ee64563ec952a146ae" Mar 16 00:30:01 crc kubenswrapper[4816]: I0316 00:30:01.730864 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-core-2-build" Mar 16 00:30:01 crc kubenswrapper[4816]: I0316 00:30:01.732586 4816 generic.go:334] "Generic (PLEG): container finished" podID="c27926cb-7a0c-4dff-a823-0c9cfdb9977c" containerID="d0c63cafc91b5e5581126cae772dc10861d08f21c797798bd26dc16d3fd85d6a" exitCode=0 Mar 16 00:30:01 crc kubenswrapper[4816]: I0316 00:30:01.732633 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29560350-pcslp" event={"ID":"c27926cb-7a0c-4dff-a823-0c9cfdb9977c","Type":"ContainerDied","Data":"d0c63cafc91b5e5581126cae772dc10861d08f21c797798bd26dc16d3fd85d6a"} Mar 16 00:30:01 crc kubenswrapper[4816]: I0316 00:30:01.732660 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29560350-pcslp" event={"ID":"c27926cb-7a0c-4dff-a823-0c9cfdb9977c","Type":"ContainerStarted","Data":"c8710f60fe7c9f4ba20c58866354cddd542d4c1c8693fb3b7c4d88443de40c27"} Mar 16 00:30:02 
crc kubenswrapper[4816]: I0316 00:30:02.740276 4816 generic.go:334] "Generic (PLEG): container finished" podID="12bfc435-89c2-4917-9bb6-cc2e9eca440c" containerID="a1355d11ec449f6a9fd6597a935b6361539d556da9968192441a1a7760e23960" exitCode=0 Mar 16 00:30:02 crc kubenswrapper[4816]: I0316 00:30:02.740487 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560350-6dpp5" event={"ID":"12bfc435-89c2-4917-9bb6-cc2e9eca440c","Type":"ContainerDied","Data":"a1355d11ec449f6a9fd6597a935b6361539d556da9968192441a1a7760e23960"} Mar 16 00:30:02 crc kubenswrapper[4816]: I0316 00:30:02.953696 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29560350-pcslp" Mar 16 00:30:02 crc kubenswrapper[4816]: I0316 00:30:02.988287 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-64s4v\" (UniqueName: \"kubernetes.io/projected/c27926cb-7a0c-4dff-a823-0c9cfdb9977c-kube-api-access-64s4v\") pod \"c27926cb-7a0c-4dff-a823-0c9cfdb9977c\" (UID: \"c27926cb-7a0c-4dff-a823-0c9cfdb9977c\") " Mar 16 00:30:02 crc kubenswrapper[4816]: I0316 00:30:02.988349 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c27926cb-7a0c-4dff-a823-0c9cfdb9977c-config-volume\") pod \"c27926cb-7a0c-4dff-a823-0c9cfdb9977c\" (UID: \"c27926cb-7a0c-4dff-a823-0c9cfdb9977c\") " Mar 16 00:30:02 crc kubenswrapper[4816]: I0316 00:30:02.988405 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c27926cb-7a0c-4dff-a823-0c9cfdb9977c-secret-volume\") pod \"c27926cb-7a0c-4dff-a823-0c9cfdb9977c\" (UID: \"c27926cb-7a0c-4dff-a823-0c9cfdb9977c\") " Mar 16 00:30:02 crc kubenswrapper[4816]: I0316 00:30:02.989127 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/c27926cb-7a0c-4dff-a823-0c9cfdb9977c-config-volume" (OuterVolumeSpecName: "config-volume") pod "c27926cb-7a0c-4dff-a823-0c9cfdb9977c" (UID: "c27926cb-7a0c-4dff-a823-0c9cfdb9977c"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:30:02 crc kubenswrapper[4816]: I0316 00:30:02.993780 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c27926cb-7a0c-4dff-a823-0c9cfdb9977c-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "c27926cb-7a0c-4dff-a823-0c9cfdb9977c" (UID: "c27926cb-7a0c-4dff-a823-0c9cfdb9977c"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:30:02 crc kubenswrapper[4816]: I0316 00:30:02.995446 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c27926cb-7a0c-4dff-a823-0c9cfdb9977c-kube-api-access-64s4v" (OuterVolumeSpecName: "kube-api-access-64s4v") pod "c27926cb-7a0c-4dff-a823-0c9cfdb9977c" (UID: "c27926cb-7a0c-4dff-a823-0c9cfdb9977c"). InnerVolumeSpecName "kube-api-access-64s4v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:30:03 crc kubenswrapper[4816]: I0316 00:30:03.089240 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-64s4v\" (UniqueName: \"kubernetes.io/projected/c27926cb-7a0c-4dff-a823-0c9cfdb9977c-kube-api-access-64s4v\") on node \"crc\" DevicePath \"\"" Mar 16 00:30:03 crc kubenswrapper[4816]: I0316 00:30:03.089269 4816 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c27926cb-7a0c-4dff-a823-0c9cfdb9977c-config-volume\") on node \"crc\" DevicePath \"\"" Mar 16 00:30:03 crc kubenswrapper[4816]: I0316 00:30:03.089279 4816 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c27926cb-7a0c-4dff-a823-0c9cfdb9977c-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 16 00:30:03 crc kubenswrapper[4816]: I0316 00:30:03.464632 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f1394889-b25e-4a90-ad3b-651e20e8ad20-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "f1394889-b25e-4a90-ad3b-651e20e8ad20" (UID: "f1394889-b25e-4a90-ad3b-651e20e8ad20"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:30:03 crc kubenswrapper[4816]: I0316 00:30:03.493842 4816 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/f1394889-b25e-4a90-ad3b-651e20e8ad20-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 16 00:30:03 crc kubenswrapper[4816]: I0316 00:30:03.750127 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29560350-pcslp" Mar 16 00:30:03 crc kubenswrapper[4816]: I0316 00:30:03.750116 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29560350-pcslp" event={"ID":"c27926cb-7a0c-4dff-a823-0c9cfdb9977c","Type":"ContainerDied","Data":"c8710f60fe7c9f4ba20c58866354cddd542d4c1c8693fb3b7c4d88443de40c27"} Mar 16 00:30:03 crc kubenswrapper[4816]: I0316 00:30:03.750327 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c8710f60fe7c9f4ba20c58866354cddd542d4c1c8693fb3b7c4d88443de40c27" Mar 16 00:30:03 crc kubenswrapper[4816]: I0316 00:30:03.956678 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560350-6dpp5" Mar 16 00:30:04 crc kubenswrapper[4816]: I0316 00:30:03.998722 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c6zlg\" (UniqueName: \"kubernetes.io/projected/12bfc435-89c2-4917-9bb6-cc2e9eca440c-kube-api-access-c6zlg\") pod \"12bfc435-89c2-4917-9bb6-cc2e9eca440c\" (UID: \"12bfc435-89c2-4917-9bb6-cc2e9eca440c\") " Mar 16 00:30:04 crc kubenswrapper[4816]: I0316 00:30:04.019836 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12bfc435-89c2-4917-9bb6-cc2e9eca440c-kube-api-access-c6zlg" (OuterVolumeSpecName: "kube-api-access-c6zlg") pod "12bfc435-89c2-4917-9bb6-cc2e9eca440c" (UID: "12bfc435-89c2-4917-9bb6-cc2e9eca440c"). InnerVolumeSpecName "kube-api-access-c6zlg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:30:04 crc kubenswrapper[4816]: I0316 00:30:04.103344 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c6zlg\" (UniqueName: \"kubernetes.io/projected/12bfc435-89c2-4917-9bb6-cc2e9eca440c-kube-api-access-c6zlg\") on node \"crc\" DevicePath \"\"" Mar 16 00:30:04 crc kubenswrapper[4816]: I0316 00:30:04.759587 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560350-6dpp5" event={"ID":"12bfc435-89c2-4917-9bb6-cc2e9eca440c","Type":"ContainerDied","Data":"49cea63e1c43a10078ab745d499a1cd66311bccc4b9367191a210448cc27ed33"} Mar 16 00:30:04 crc kubenswrapper[4816]: I0316 00:30:04.759634 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560350-6dpp5" Mar 16 00:30:04 crc kubenswrapper[4816]: I0316 00:30:04.759638 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="49cea63e1c43a10078ab745d499a1cd66311bccc4b9367191a210448cc27ed33" Mar 16 00:30:05 crc kubenswrapper[4816]: I0316 00:30:05.013659 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29560344-qmt9b"] Mar 16 00:30:05 crc kubenswrapper[4816]: I0316 00:30:05.022672 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29560344-qmt9b"] Mar 16 00:30:05 crc kubenswrapper[4816]: I0316 00:30:05.676534 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="add0bb36-9ea9-49c0-9dc6-ba31c3cfcf2d" path="/var/lib/kubelet/pods/add0bb36-9ea9-49c0-9dc6-ba31c3cfcf2d/volumes" Mar 16 00:30:05 crc kubenswrapper[4816]: I0316 00:30:05.807348 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/sg-bridge-1-build"] Mar 16 00:30:05 crc kubenswrapper[4816]: E0316 00:30:05.807611 4816 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="f1394889-b25e-4a90-ad3b-651e20e8ad20" containerName="git-clone" Mar 16 00:30:05 crc kubenswrapper[4816]: I0316 00:30:05.807622 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1394889-b25e-4a90-ad3b-651e20e8ad20" containerName="git-clone" Mar 16 00:30:05 crc kubenswrapper[4816]: E0316 00:30:05.807636 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1394889-b25e-4a90-ad3b-651e20e8ad20" containerName="docker-build" Mar 16 00:30:05 crc kubenswrapper[4816]: I0316 00:30:05.807642 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1394889-b25e-4a90-ad3b-651e20e8ad20" containerName="docker-build" Mar 16 00:30:05 crc kubenswrapper[4816]: E0316 00:30:05.807652 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1394889-b25e-4a90-ad3b-651e20e8ad20" containerName="manage-dockerfile" Mar 16 00:30:05 crc kubenswrapper[4816]: I0316 00:30:05.807658 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1394889-b25e-4a90-ad3b-651e20e8ad20" containerName="manage-dockerfile" Mar 16 00:30:05 crc kubenswrapper[4816]: E0316 00:30:05.807670 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c27926cb-7a0c-4dff-a823-0c9cfdb9977c" containerName="collect-profiles" Mar 16 00:30:05 crc kubenswrapper[4816]: I0316 00:30:05.807675 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="c27926cb-7a0c-4dff-a823-0c9cfdb9977c" containerName="collect-profiles" Mar 16 00:30:05 crc kubenswrapper[4816]: E0316 00:30:05.807683 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12bfc435-89c2-4917-9bb6-cc2e9eca440c" containerName="oc" Mar 16 00:30:05 crc kubenswrapper[4816]: I0316 00:30:05.807688 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="12bfc435-89c2-4917-9bb6-cc2e9eca440c" containerName="oc" Mar 16 00:30:05 crc kubenswrapper[4816]: I0316 00:30:05.807784 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="12bfc435-89c2-4917-9bb6-cc2e9eca440c" 
containerName="oc" Mar 16 00:30:05 crc kubenswrapper[4816]: I0316 00:30:05.807792 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="c27926cb-7a0c-4dff-a823-0c9cfdb9977c" containerName="collect-profiles" Mar 16 00:30:05 crc kubenswrapper[4816]: I0316 00:30:05.807805 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1394889-b25e-4a90-ad3b-651e20e8ad20" containerName="docker-build" Mar 16 00:30:05 crc kubenswrapper[4816]: I0316 00:30:05.808357 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-bridge-1-build" Mar 16 00:30:05 crc kubenswrapper[4816]: I0316 00:30:05.810905 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-bridge-1-ca" Mar 16 00:30:05 crc kubenswrapper[4816]: I0316 00:30:05.812827 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-bridge-1-global-ca" Mar 16 00:30:05 crc kubenswrapper[4816]: I0316 00:30:05.813065 4816 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-fs5z5" Mar 16 00:30:05 crc kubenswrapper[4816]: I0316 00:30:05.813305 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-bridge-1-sys-config" Mar 16 00:30:05 crc kubenswrapper[4816]: I0316 00:30:05.827858 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/8a12a3fd-22a7-4cc3-ac53-7463cafa502b-buildcachedir\") pod \"sg-bridge-1-build\" (UID: \"8a12a3fd-22a7-4cc3-ac53-7463cafa502b\") " pod="service-telemetry/sg-bridge-1-build" Mar 16 00:30:05 crc kubenswrapper[4816]: I0316 00:30:05.827923 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8a12a3fd-22a7-4cc3-ac53-7463cafa502b-build-ca-bundles\") pod 
\"sg-bridge-1-build\" (UID: \"8a12a3fd-22a7-4cc3-ac53-7463cafa502b\") " pod="service-telemetry/sg-bridge-1-build" Mar 16 00:30:05 crc kubenswrapper[4816]: I0316 00:30:05.827957 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/8a12a3fd-22a7-4cc3-ac53-7463cafa502b-build-blob-cache\") pod \"sg-bridge-1-build\" (UID: \"8a12a3fd-22a7-4cc3-ac53-7463cafa502b\") " pod="service-telemetry/sg-bridge-1-build" Mar 16 00:30:05 crc kubenswrapper[4816]: I0316 00:30:05.827984 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/8a12a3fd-22a7-4cc3-ac53-7463cafa502b-build-system-configs\") pod \"sg-bridge-1-build\" (UID: \"8a12a3fd-22a7-4cc3-ac53-7463cafa502b\") " pod="service-telemetry/sg-bridge-1-build" Mar 16 00:30:05 crc kubenswrapper[4816]: I0316 00:30:05.828067 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-bridge-1-build"] Mar 16 00:30:05 crc kubenswrapper[4816]: I0316 00:30:05.828213 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/8a12a3fd-22a7-4cc3-ac53-7463cafa502b-buildworkdir\") pod \"sg-bridge-1-build\" (UID: \"8a12a3fd-22a7-4cc3-ac53-7463cafa502b\") " pod="service-telemetry/sg-bridge-1-build" Mar 16 00:30:05 crc kubenswrapper[4816]: I0316 00:30:05.828247 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-fs5z5-push\" (UniqueName: \"kubernetes.io/secret/8a12a3fd-22a7-4cc3-ac53-7463cafa502b-builder-dockercfg-fs5z5-push\") pod \"sg-bridge-1-build\" (UID: \"8a12a3fd-22a7-4cc3-ac53-7463cafa502b\") " pod="service-telemetry/sg-bridge-1-build" Mar 16 00:30:05 crc kubenswrapper[4816]: I0316 00:30:05.828275 4816 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/8a12a3fd-22a7-4cc3-ac53-7463cafa502b-container-storage-run\") pod \"sg-bridge-1-build\" (UID: \"8a12a3fd-22a7-4cc3-ac53-7463cafa502b\") " pod="service-telemetry/sg-bridge-1-build" Mar 16 00:30:05 crc kubenswrapper[4816]: I0316 00:30:05.828302 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-fs5z5-pull\" (UniqueName: \"kubernetes.io/secret/8a12a3fd-22a7-4cc3-ac53-7463cafa502b-builder-dockercfg-fs5z5-pull\") pod \"sg-bridge-1-build\" (UID: \"8a12a3fd-22a7-4cc3-ac53-7463cafa502b\") " pod="service-telemetry/sg-bridge-1-build" Mar 16 00:30:05 crc kubenswrapper[4816]: I0316 00:30:05.828389 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f2jdp\" (UniqueName: \"kubernetes.io/projected/8a12a3fd-22a7-4cc3-ac53-7463cafa502b-kube-api-access-f2jdp\") pod \"sg-bridge-1-build\" (UID: \"8a12a3fd-22a7-4cc3-ac53-7463cafa502b\") " pod="service-telemetry/sg-bridge-1-build" Mar 16 00:30:05 crc kubenswrapper[4816]: I0316 00:30:05.828531 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/8a12a3fd-22a7-4cc3-ac53-7463cafa502b-container-storage-root\") pod \"sg-bridge-1-build\" (UID: \"8a12a3fd-22a7-4cc3-ac53-7463cafa502b\") " pod="service-telemetry/sg-bridge-1-build" Mar 16 00:30:05 crc kubenswrapper[4816]: I0316 00:30:05.828649 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8a12a3fd-22a7-4cc3-ac53-7463cafa502b-build-proxy-ca-bundles\") pod \"sg-bridge-1-build\" (UID: \"8a12a3fd-22a7-4cc3-ac53-7463cafa502b\") " pod="service-telemetry/sg-bridge-1-build" Mar 16 00:30:05 crc 
kubenswrapper[4816]: I0316 00:30:05.828691 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/8a12a3fd-22a7-4cc3-ac53-7463cafa502b-node-pullsecrets\") pod \"sg-bridge-1-build\" (UID: \"8a12a3fd-22a7-4cc3-ac53-7463cafa502b\") " pod="service-telemetry/sg-bridge-1-build" Mar 16 00:30:05 crc kubenswrapper[4816]: I0316 00:30:05.930138 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8a12a3fd-22a7-4cc3-ac53-7463cafa502b-build-ca-bundles\") pod \"sg-bridge-1-build\" (UID: \"8a12a3fd-22a7-4cc3-ac53-7463cafa502b\") " pod="service-telemetry/sg-bridge-1-build" Mar 16 00:30:05 crc kubenswrapper[4816]: I0316 00:30:05.930251 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/8a12a3fd-22a7-4cc3-ac53-7463cafa502b-build-blob-cache\") pod \"sg-bridge-1-build\" (UID: \"8a12a3fd-22a7-4cc3-ac53-7463cafa502b\") " pod="service-telemetry/sg-bridge-1-build" Mar 16 00:30:05 crc kubenswrapper[4816]: I0316 00:30:05.930289 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/8a12a3fd-22a7-4cc3-ac53-7463cafa502b-build-system-configs\") pod \"sg-bridge-1-build\" (UID: \"8a12a3fd-22a7-4cc3-ac53-7463cafa502b\") " pod="service-telemetry/sg-bridge-1-build" Mar 16 00:30:05 crc kubenswrapper[4816]: I0316 00:30:05.930325 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/8a12a3fd-22a7-4cc3-ac53-7463cafa502b-buildworkdir\") pod \"sg-bridge-1-build\" (UID: \"8a12a3fd-22a7-4cc3-ac53-7463cafa502b\") " pod="service-telemetry/sg-bridge-1-build" Mar 16 00:30:05 crc kubenswrapper[4816]: I0316 00:30:05.930355 4816 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"builder-dockercfg-fs5z5-push\" (UniqueName: \"kubernetes.io/secret/8a12a3fd-22a7-4cc3-ac53-7463cafa502b-builder-dockercfg-fs5z5-push\") pod \"sg-bridge-1-build\" (UID: \"8a12a3fd-22a7-4cc3-ac53-7463cafa502b\") " pod="service-telemetry/sg-bridge-1-build" Mar 16 00:30:05 crc kubenswrapper[4816]: I0316 00:30:05.930384 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/8a12a3fd-22a7-4cc3-ac53-7463cafa502b-container-storage-run\") pod \"sg-bridge-1-build\" (UID: \"8a12a3fd-22a7-4cc3-ac53-7463cafa502b\") " pod="service-telemetry/sg-bridge-1-build" Mar 16 00:30:05 crc kubenswrapper[4816]: I0316 00:30:05.930419 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-fs5z5-pull\" (UniqueName: \"kubernetes.io/secret/8a12a3fd-22a7-4cc3-ac53-7463cafa502b-builder-dockercfg-fs5z5-pull\") pod \"sg-bridge-1-build\" (UID: \"8a12a3fd-22a7-4cc3-ac53-7463cafa502b\") " pod="service-telemetry/sg-bridge-1-build" Mar 16 00:30:05 crc kubenswrapper[4816]: I0316 00:30:05.930479 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f2jdp\" (UniqueName: \"kubernetes.io/projected/8a12a3fd-22a7-4cc3-ac53-7463cafa502b-kube-api-access-f2jdp\") pod \"sg-bridge-1-build\" (UID: \"8a12a3fd-22a7-4cc3-ac53-7463cafa502b\") " pod="service-telemetry/sg-bridge-1-build" Mar 16 00:30:05 crc kubenswrapper[4816]: I0316 00:30:05.930525 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/8a12a3fd-22a7-4cc3-ac53-7463cafa502b-container-storage-root\") pod \"sg-bridge-1-build\" (UID: \"8a12a3fd-22a7-4cc3-ac53-7463cafa502b\") " pod="service-telemetry/sg-bridge-1-build" Mar 16 00:30:05 crc kubenswrapper[4816]: I0316 00:30:05.930583 4816 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8a12a3fd-22a7-4cc3-ac53-7463cafa502b-build-proxy-ca-bundles\") pod \"sg-bridge-1-build\" (UID: \"8a12a3fd-22a7-4cc3-ac53-7463cafa502b\") " pod="service-telemetry/sg-bridge-1-build" Mar 16 00:30:05 crc kubenswrapper[4816]: I0316 00:30:05.930624 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/8a12a3fd-22a7-4cc3-ac53-7463cafa502b-node-pullsecrets\") pod \"sg-bridge-1-build\" (UID: \"8a12a3fd-22a7-4cc3-ac53-7463cafa502b\") " pod="service-telemetry/sg-bridge-1-build" Mar 16 00:30:05 crc kubenswrapper[4816]: I0316 00:30:05.930663 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/8a12a3fd-22a7-4cc3-ac53-7463cafa502b-buildcachedir\") pod \"sg-bridge-1-build\" (UID: \"8a12a3fd-22a7-4cc3-ac53-7463cafa502b\") " pod="service-telemetry/sg-bridge-1-build" Mar 16 00:30:05 crc kubenswrapper[4816]: I0316 00:30:05.930763 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/8a12a3fd-22a7-4cc3-ac53-7463cafa502b-buildcachedir\") pod \"sg-bridge-1-build\" (UID: \"8a12a3fd-22a7-4cc3-ac53-7463cafa502b\") " pod="service-telemetry/sg-bridge-1-build" Mar 16 00:30:05 crc kubenswrapper[4816]: I0316 00:30:05.930933 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/8a12a3fd-22a7-4cc3-ac53-7463cafa502b-buildworkdir\") pod \"sg-bridge-1-build\" (UID: \"8a12a3fd-22a7-4cc3-ac53-7463cafa502b\") " pod="service-telemetry/sg-bridge-1-build" Mar 16 00:30:05 crc kubenswrapper[4816]: I0316 00:30:05.930965 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/8a12a3fd-22a7-4cc3-ac53-7463cafa502b-build-blob-cache\") 
pod \"sg-bridge-1-build\" (UID: \"8a12a3fd-22a7-4cc3-ac53-7463cafa502b\") " pod="service-telemetry/sg-bridge-1-build" Mar 16 00:30:05 crc kubenswrapper[4816]: I0316 00:30:05.931035 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/8a12a3fd-22a7-4cc3-ac53-7463cafa502b-node-pullsecrets\") pod \"sg-bridge-1-build\" (UID: \"8a12a3fd-22a7-4cc3-ac53-7463cafa502b\") " pod="service-telemetry/sg-bridge-1-build" Mar 16 00:30:05 crc kubenswrapper[4816]: I0316 00:30:05.931045 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/8a12a3fd-22a7-4cc3-ac53-7463cafa502b-container-storage-run\") pod \"sg-bridge-1-build\" (UID: \"8a12a3fd-22a7-4cc3-ac53-7463cafa502b\") " pod="service-telemetry/sg-bridge-1-build" Mar 16 00:30:05 crc kubenswrapper[4816]: I0316 00:30:05.931295 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/8a12a3fd-22a7-4cc3-ac53-7463cafa502b-container-storage-root\") pod \"sg-bridge-1-build\" (UID: \"8a12a3fd-22a7-4cc3-ac53-7463cafa502b\") " pod="service-telemetry/sg-bridge-1-build" Mar 16 00:30:05 crc kubenswrapper[4816]: I0316 00:30:05.931508 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8a12a3fd-22a7-4cc3-ac53-7463cafa502b-build-proxy-ca-bundles\") pod \"sg-bridge-1-build\" (UID: \"8a12a3fd-22a7-4cc3-ac53-7463cafa502b\") " pod="service-telemetry/sg-bridge-1-build" Mar 16 00:30:05 crc kubenswrapper[4816]: I0316 00:30:05.931644 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8a12a3fd-22a7-4cc3-ac53-7463cafa502b-build-ca-bundles\") pod \"sg-bridge-1-build\" (UID: \"8a12a3fd-22a7-4cc3-ac53-7463cafa502b\") " 
pod="service-telemetry/sg-bridge-1-build" Mar 16 00:30:05 crc kubenswrapper[4816]: I0316 00:30:05.931881 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/8a12a3fd-22a7-4cc3-ac53-7463cafa502b-build-system-configs\") pod \"sg-bridge-1-build\" (UID: \"8a12a3fd-22a7-4cc3-ac53-7463cafa502b\") " pod="service-telemetry/sg-bridge-1-build" Mar 16 00:30:05 crc kubenswrapper[4816]: I0316 00:30:05.934183 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-fs5z5-push\" (UniqueName: \"kubernetes.io/secret/8a12a3fd-22a7-4cc3-ac53-7463cafa502b-builder-dockercfg-fs5z5-push\") pod \"sg-bridge-1-build\" (UID: \"8a12a3fd-22a7-4cc3-ac53-7463cafa502b\") " pod="service-telemetry/sg-bridge-1-build" Mar 16 00:30:05 crc kubenswrapper[4816]: I0316 00:30:05.937162 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-fs5z5-pull\" (UniqueName: \"kubernetes.io/secret/8a12a3fd-22a7-4cc3-ac53-7463cafa502b-builder-dockercfg-fs5z5-pull\") pod \"sg-bridge-1-build\" (UID: \"8a12a3fd-22a7-4cc3-ac53-7463cafa502b\") " pod="service-telemetry/sg-bridge-1-build" Mar 16 00:30:05 crc kubenswrapper[4816]: I0316 00:30:05.960774 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f2jdp\" (UniqueName: \"kubernetes.io/projected/8a12a3fd-22a7-4cc3-ac53-7463cafa502b-kube-api-access-f2jdp\") pod \"sg-bridge-1-build\" (UID: \"8a12a3fd-22a7-4cc3-ac53-7463cafa502b\") " pod="service-telemetry/sg-bridge-1-build" Mar 16 00:30:06 crc kubenswrapper[4816]: I0316 00:30:06.129038 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-bridge-1-build" Mar 16 00:30:06 crc kubenswrapper[4816]: I0316 00:30:06.381115 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-bridge-1-build"] Mar 16 00:30:06 crc kubenswrapper[4816]: W0316 00:30:06.387346 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8a12a3fd_22a7_4cc3_ac53_7463cafa502b.slice/crio-8a107ef141b4ece40f15dcdf9c4c87527a33a8b3c6765c1fce2f47803bc47549 WatchSource:0}: Error finding container 8a107ef141b4ece40f15dcdf9c4c87527a33a8b3c6765c1fce2f47803bc47549: Status 404 returned error can't find the container with id 8a107ef141b4ece40f15dcdf9c4c87527a33a8b3c6765c1fce2f47803bc47549 Mar 16 00:30:06 crc kubenswrapper[4816]: I0316 00:30:06.771975 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-1-build" event={"ID":"8a12a3fd-22a7-4cc3-ac53-7463cafa502b","Type":"ContainerStarted","Data":"8a107ef141b4ece40f15dcdf9c4c87527a33a8b3c6765c1fce2f47803bc47549"} Mar 16 00:30:07 crc kubenswrapper[4816]: I0316 00:30:07.783716 4816 generic.go:334] "Generic (PLEG): container finished" podID="8a12a3fd-22a7-4cc3-ac53-7463cafa502b" containerID="dbea9dcde116242b7d23230041fd331ed37ce34d53b83f1909398b83e6d4d7ee" exitCode=0 Mar 16 00:30:07 crc kubenswrapper[4816]: I0316 00:30:07.783869 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-1-build" event={"ID":"8a12a3fd-22a7-4cc3-ac53-7463cafa502b","Type":"ContainerDied","Data":"dbea9dcde116242b7d23230041fd331ed37ce34d53b83f1909398b83e6d4d7ee"} Mar 16 00:30:08 crc kubenswrapper[4816]: I0316 00:30:08.794256 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-1-build" event={"ID":"8a12a3fd-22a7-4cc3-ac53-7463cafa502b","Type":"ContainerStarted","Data":"4b795c4bb6bb53330ab38747d1f1ac7c12b7bd727854ad62563f6bcf284e3e15"} Mar 16 00:30:08 crc kubenswrapper[4816]: 
I0316 00:30:08.822142 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/sg-bridge-1-build" podStartSLOduration=3.822117102 podStartE2EDuration="3.822117102s" podCreationTimestamp="2026-03-16 00:30:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:30:08.81894139 +0000 UTC m=+1401.915241363" watchObservedRunningTime="2026-03-16 00:30:08.822117102 +0000 UTC m=+1401.918417075" Mar 16 00:30:15 crc kubenswrapper[4816]: I0316 00:30:15.622452 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/sg-bridge-1-build"] Mar 16 00:30:15 crc kubenswrapper[4816]: I0316 00:30:15.623629 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/sg-bridge-1-build" podUID="8a12a3fd-22a7-4cc3-ac53-7463cafa502b" containerName="docker-build" containerID="cri-o://4b795c4bb6bb53330ab38747d1f1ac7c12b7bd727854ad62563f6bcf284e3e15" gracePeriod=30 Mar 16 00:30:15 crc kubenswrapper[4816]: I0316 00:30:15.863279 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_sg-bridge-1-build_8a12a3fd-22a7-4cc3-ac53-7463cafa502b/docker-build/0.log" Mar 16 00:30:15 crc kubenswrapper[4816]: I0316 00:30:15.867123 4816 generic.go:334] "Generic (PLEG): container finished" podID="8a12a3fd-22a7-4cc3-ac53-7463cafa502b" containerID="4b795c4bb6bb53330ab38747d1f1ac7c12b7bd727854ad62563f6bcf284e3e15" exitCode=1 Mar 16 00:30:15 crc kubenswrapper[4816]: I0316 00:30:15.867175 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-1-build" event={"ID":"8a12a3fd-22a7-4cc3-ac53-7463cafa502b","Type":"ContainerDied","Data":"4b795c4bb6bb53330ab38747d1f1ac7c12b7bd727854ad62563f6bcf284e3e15"} Mar 16 00:30:16 crc kubenswrapper[4816]: I0316 00:30:16.061077 4816 log.go:25] "Finished parsing log file" 
path="/var/log/pods/service-telemetry_sg-bridge-1-build_8a12a3fd-22a7-4cc3-ac53-7463cafa502b/docker-build/0.log" Mar 16 00:30:16 crc kubenswrapper[4816]: I0316 00:30:16.061395 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-bridge-1-build" Mar 16 00:30:16 crc kubenswrapper[4816]: I0316 00:30:16.096168 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/8a12a3fd-22a7-4cc3-ac53-7463cafa502b-build-blob-cache\") pod \"8a12a3fd-22a7-4cc3-ac53-7463cafa502b\" (UID: \"8a12a3fd-22a7-4cc3-ac53-7463cafa502b\") " Mar 16 00:30:16 crc kubenswrapper[4816]: I0316 00:30:16.096242 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/8a12a3fd-22a7-4cc3-ac53-7463cafa502b-container-storage-root\") pod \"8a12a3fd-22a7-4cc3-ac53-7463cafa502b\" (UID: \"8a12a3fd-22a7-4cc3-ac53-7463cafa502b\") " Mar 16 00:30:16 crc kubenswrapper[4816]: I0316 00:30:16.096321 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f2jdp\" (UniqueName: \"kubernetes.io/projected/8a12a3fd-22a7-4cc3-ac53-7463cafa502b-kube-api-access-f2jdp\") pod \"8a12a3fd-22a7-4cc3-ac53-7463cafa502b\" (UID: \"8a12a3fd-22a7-4cc3-ac53-7463cafa502b\") " Mar 16 00:30:16 crc kubenswrapper[4816]: I0316 00:30:16.096475 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-fs5z5-pull\" (UniqueName: \"kubernetes.io/secret/8a12a3fd-22a7-4cc3-ac53-7463cafa502b-builder-dockercfg-fs5z5-pull\") pod \"8a12a3fd-22a7-4cc3-ac53-7463cafa502b\" (UID: \"8a12a3fd-22a7-4cc3-ac53-7463cafa502b\") " Mar 16 00:30:16 crc kubenswrapper[4816]: I0316 00:30:16.098659 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: 
\"kubernetes.io/configmap/8a12a3fd-22a7-4cc3-ac53-7463cafa502b-build-system-configs\") pod \"8a12a3fd-22a7-4cc3-ac53-7463cafa502b\" (UID: \"8a12a3fd-22a7-4cc3-ac53-7463cafa502b\") " Mar 16 00:30:16 crc kubenswrapper[4816]: I0316 00:30:16.098721 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8a12a3fd-22a7-4cc3-ac53-7463cafa502b-build-ca-bundles\") pod \"8a12a3fd-22a7-4cc3-ac53-7463cafa502b\" (UID: \"8a12a3fd-22a7-4cc3-ac53-7463cafa502b\") " Mar 16 00:30:16 crc kubenswrapper[4816]: I0316 00:30:16.098745 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/8a12a3fd-22a7-4cc3-ac53-7463cafa502b-node-pullsecrets\") pod \"8a12a3fd-22a7-4cc3-ac53-7463cafa502b\" (UID: \"8a12a3fd-22a7-4cc3-ac53-7463cafa502b\") " Mar 16 00:30:16 crc kubenswrapper[4816]: I0316 00:30:16.098768 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/8a12a3fd-22a7-4cc3-ac53-7463cafa502b-buildworkdir\") pod \"8a12a3fd-22a7-4cc3-ac53-7463cafa502b\" (UID: \"8a12a3fd-22a7-4cc3-ac53-7463cafa502b\") " Mar 16 00:30:16 crc kubenswrapper[4816]: I0316 00:30:16.098802 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-fs5z5-push\" (UniqueName: \"kubernetes.io/secret/8a12a3fd-22a7-4cc3-ac53-7463cafa502b-builder-dockercfg-fs5z5-push\") pod \"8a12a3fd-22a7-4cc3-ac53-7463cafa502b\" (UID: \"8a12a3fd-22a7-4cc3-ac53-7463cafa502b\") " Mar 16 00:30:16 crc kubenswrapper[4816]: I0316 00:30:16.098823 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/8a12a3fd-22a7-4cc3-ac53-7463cafa502b-container-storage-run\") pod \"8a12a3fd-22a7-4cc3-ac53-7463cafa502b\" (UID: \"8a12a3fd-22a7-4cc3-ac53-7463cafa502b\") " Mar 
16 00:30:16 crc kubenswrapper[4816]: I0316 00:30:16.098839 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8a12a3fd-22a7-4cc3-ac53-7463cafa502b-build-proxy-ca-bundles\") pod \"8a12a3fd-22a7-4cc3-ac53-7463cafa502b\" (UID: \"8a12a3fd-22a7-4cc3-ac53-7463cafa502b\") " Mar 16 00:30:16 crc kubenswrapper[4816]: I0316 00:30:16.098855 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/8a12a3fd-22a7-4cc3-ac53-7463cafa502b-buildcachedir\") pod \"8a12a3fd-22a7-4cc3-ac53-7463cafa502b\" (UID: \"8a12a3fd-22a7-4cc3-ac53-7463cafa502b\") " Mar 16 00:30:16 crc kubenswrapper[4816]: I0316 00:30:16.099244 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8a12a3fd-22a7-4cc3-ac53-7463cafa502b-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "8a12a3fd-22a7-4cc3-ac53-7463cafa502b" (UID: "8a12a3fd-22a7-4cc3-ac53-7463cafa502b"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:30:16 crc kubenswrapper[4816]: I0316 00:30:16.100140 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8a12a3fd-22a7-4cc3-ac53-7463cafa502b-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "8a12a3fd-22a7-4cc3-ac53-7463cafa502b" (UID: "8a12a3fd-22a7-4cc3-ac53-7463cafa502b"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:30:16 crc kubenswrapper[4816]: I0316 00:30:16.100245 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8a12a3fd-22a7-4cc3-ac53-7463cafa502b-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "8a12a3fd-22a7-4cc3-ac53-7463cafa502b" (UID: "8a12a3fd-22a7-4cc3-ac53-7463cafa502b"). 
InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:30:16 crc kubenswrapper[4816]: I0316 00:30:16.100416 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8a12a3fd-22a7-4cc3-ac53-7463cafa502b-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "8a12a3fd-22a7-4cc3-ac53-7463cafa502b" (UID: "8a12a3fd-22a7-4cc3-ac53-7463cafa502b"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:30:16 crc kubenswrapper[4816]: I0316 00:30:16.100913 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a12a3fd-22a7-4cc3-ac53-7463cafa502b-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "8a12a3fd-22a7-4cc3-ac53-7463cafa502b" (UID: "8a12a3fd-22a7-4cc3-ac53-7463cafa502b"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:30:16 crc kubenswrapper[4816]: I0316 00:30:16.101174 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8a12a3fd-22a7-4cc3-ac53-7463cafa502b-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "8a12a3fd-22a7-4cc3-ac53-7463cafa502b" (UID: "8a12a3fd-22a7-4cc3-ac53-7463cafa502b"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:30:16 crc kubenswrapper[4816]: I0316 00:30:16.101237 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a12a3fd-22a7-4cc3-ac53-7463cafa502b-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "8a12a3fd-22a7-4cc3-ac53-7463cafa502b" (UID: "8a12a3fd-22a7-4cc3-ac53-7463cafa502b"). InnerVolumeSpecName "buildworkdir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:30:16 crc kubenswrapper[4816]: I0316 00:30:16.102360 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a12a3fd-22a7-4cc3-ac53-7463cafa502b-builder-dockercfg-fs5z5-pull" (OuterVolumeSpecName: "builder-dockercfg-fs5z5-pull") pod "8a12a3fd-22a7-4cc3-ac53-7463cafa502b" (UID: "8a12a3fd-22a7-4cc3-ac53-7463cafa502b"). InnerVolumeSpecName "builder-dockercfg-fs5z5-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:30:16 crc kubenswrapper[4816]: I0316 00:30:16.102721 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a12a3fd-22a7-4cc3-ac53-7463cafa502b-kube-api-access-f2jdp" (OuterVolumeSpecName: "kube-api-access-f2jdp") pod "8a12a3fd-22a7-4cc3-ac53-7463cafa502b" (UID: "8a12a3fd-22a7-4cc3-ac53-7463cafa502b"). InnerVolumeSpecName "kube-api-access-f2jdp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:30:16 crc kubenswrapper[4816]: I0316 00:30:16.103099 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a12a3fd-22a7-4cc3-ac53-7463cafa502b-builder-dockercfg-fs5z5-push" (OuterVolumeSpecName: "builder-dockercfg-fs5z5-push") pod "8a12a3fd-22a7-4cc3-ac53-7463cafa502b" (UID: "8a12a3fd-22a7-4cc3-ac53-7463cafa502b"). InnerVolumeSpecName "builder-dockercfg-fs5z5-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:30:16 crc kubenswrapper[4816]: I0316 00:30:16.173219 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a12a3fd-22a7-4cc3-ac53-7463cafa502b-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "8a12a3fd-22a7-4cc3-ac53-7463cafa502b" (UID: "8a12a3fd-22a7-4cc3-ac53-7463cafa502b"). InnerVolumeSpecName "build-blob-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:30:16 crc kubenswrapper[4816]: I0316 00:30:16.200411 4816 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-fs5z5-pull\" (UniqueName: \"kubernetes.io/secret/8a12a3fd-22a7-4cc3-ac53-7463cafa502b-builder-dockercfg-fs5z5-pull\") on node \"crc\" DevicePath \"\"" Mar 16 00:30:16 crc kubenswrapper[4816]: I0316 00:30:16.200458 4816 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/8a12a3fd-22a7-4cc3-ac53-7463cafa502b-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 16 00:30:16 crc kubenswrapper[4816]: I0316 00:30:16.200473 4816 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8a12a3fd-22a7-4cc3-ac53-7463cafa502b-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 16 00:30:16 crc kubenswrapper[4816]: I0316 00:30:16.200486 4816 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/8a12a3fd-22a7-4cc3-ac53-7463cafa502b-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 16 00:30:16 crc kubenswrapper[4816]: I0316 00:30:16.200501 4816 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/8a12a3fd-22a7-4cc3-ac53-7463cafa502b-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 16 00:30:16 crc kubenswrapper[4816]: I0316 00:30:16.200541 4816 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-fs5z5-push\" (UniqueName: \"kubernetes.io/secret/8a12a3fd-22a7-4cc3-ac53-7463cafa502b-builder-dockercfg-fs5z5-push\") on node \"crc\" DevicePath \"\"" Mar 16 00:30:16 crc kubenswrapper[4816]: I0316 00:30:16.200572 4816 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/8a12a3fd-22a7-4cc3-ac53-7463cafa502b-container-storage-run\") on node \"crc\" 
DevicePath \"\"" Mar 16 00:30:16 crc kubenswrapper[4816]: I0316 00:30:16.200585 4816 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8a12a3fd-22a7-4cc3-ac53-7463cafa502b-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 16 00:30:16 crc kubenswrapper[4816]: I0316 00:30:16.200598 4816 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/8a12a3fd-22a7-4cc3-ac53-7463cafa502b-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 16 00:30:16 crc kubenswrapper[4816]: I0316 00:30:16.200610 4816 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/8a12a3fd-22a7-4cc3-ac53-7463cafa502b-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 16 00:30:16 crc kubenswrapper[4816]: I0316 00:30:16.200622 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f2jdp\" (UniqueName: \"kubernetes.io/projected/8a12a3fd-22a7-4cc3-ac53-7463cafa502b-kube-api-access-f2jdp\") on node \"crc\" DevicePath \"\"" Mar 16 00:30:16 crc kubenswrapper[4816]: I0316 00:30:16.511236 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a12a3fd-22a7-4cc3-ac53-7463cafa502b-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "8a12a3fd-22a7-4cc3-ac53-7463cafa502b" (UID: "8a12a3fd-22a7-4cc3-ac53-7463cafa502b"). InnerVolumeSpecName "container-storage-root". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:30:16 crc kubenswrapper[4816]: I0316 00:30:16.512197 4816 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/8a12a3fd-22a7-4cc3-ac53-7463cafa502b-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 16 00:30:16 crc kubenswrapper[4816]: I0316 00:30:16.877319 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_sg-bridge-1-build_8a12a3fd-22a7-4cc3-ac53-7463cafa502b/docker-build/0.log" Mar 16 00:30:16 crc kubenswrapper[4816]: I0316 00:30:16.877991 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-1-build" event={"ID":"8a12a3fd-22a7-4cc3-ac53-7463cafa502b","Type":"ContainerDied","Data":"8a107ef141b4ece40f15dcdf9c4c87527a33a8b3c6765c1fce2f47803bc47549"} Mar 16 00:30:16 crc kubenswrapper[4816]: I0316 00:30:16.878045 4816 scope.go:117] "RemoveContainer" containerID="4b795c4bb6bb53330ab38747d1f1ac7c12b7bd727854ad62563f6bcf284e3e15" Mar 16 00:30:16 crc kubenswrapper[4816]: I0316 00:30:16.878164 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-bridge-1-build" Mar 16 00:30:16 crc kubenswrapper[4816]: I0316 00:30:16.908644 4816 scope.go:117] "RemoveContainer" containerID="dbea9dcde116242b7d23230041fd331ed37ce34d53b83f1909398b83e6d4d7ee" Mar 16 00:30:16 crc kubenswrapper[4816]: I0316 00:30:16.928806 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/sg-bridge-1-build"] Mar 16 00:30:16 crc kubenswrapper[4816]: I0316 00:30:16.952089 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/sg-bridge-1-build"] Mar 16 00:30:17 crc kubenswrapper[4816]: I0316 00:30:17.281317 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/sg-bridge-2-build"] Mar 16 00:30:17 crc kubenswrapper[4816]: E0316 00:30:17.298499 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a12a3fd-22a7-4cc3-ac53-7463cafa502b" containerName="docker-build" Mar 16 00:30:17 crc kubenswrapper[4816]: I0316 00:30:17.298788 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a12a3fd-22a7-4cc3-ac53-7463cafa502b" containerName="docker-build" Mar 16 00:30:17 crc kubenswrapper[4816]: E0316 00:30:17.298908 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a12a3fd-22a7-4cc3-ac53-7463cafa502b" containerName="manage-dockerfile" Mar 16 00:30:17 crc kubenswrapper[4816]: I0316 00:30:17.299009 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a12a3fd-22a7-4cc3-ac53-7463cafa502b" containerName="manage-dockerfile" Mar 16 00:30:17 crc kubenswrapper[4816]: I0316 00:30:17.299920 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a12a3fd-22a7-4cc3-ac53-7463cafa502b" containerName="docker-build" Mar 16 00:30:17 crc kubenswrapper[4816]: I0316 00:30:17.301948 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-bridge-2-build" Mar 16 00:30:17 crc kubenswrapper[4816]: I0316 00:30:17.312808 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-bridge-2-global-ca" Mar 16 00:30:17 crc kubenswrapper[4816]: I0316 00:30:17.314178 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-bridge-2-sys-config" Mar 16 00:30:17 crc kubenswrapper[4816]: I0316 00:30:17.314592 4816 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-fs5z5" Mar 16 00:30:17 crc kubenswrapper[4816]: I0316 00:30:17.314975 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-bridge-2-ca" Mar 16 00:30:17 crc kubenswrapper[4816]: I0316 00:30:17.315492 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-bridge-2-build"] Mar 16 00:30:17 crc kubenswrapper[4816]: I0316 00:30:17.325256 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-fs5z5-pull\" (UniqueName: \"kubernetes.io/secret/e28d6969-ebed-4cf6-bb79-47e69bd952b9-builder-dockercfg-fs5z5-pull\") pod \"sg-bridge-2-build\" (UID: \"e28d6969-ebed-4cf6-bb79-47e69bd952b9\") " pod="service-telemetry/sg-bridge-2-build" Mar 16 00:30:17 crc kubenswrapper[4816]: I0316 00:30:17.325320 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/e28d6969-ebed-4cf6-bb79-47e69bd952b9-build-blob-cache\") pod \"sg-bridge-2-build\" (UID: \"e28d6969-ebed-4cf6-bb79-47e69bd952b9\") " pod="service-telemetry/sg-bridge-2-build" Mar 16 00:30:17 crc kubenswrapper[4816]: I0316 00:30:17.325359 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-fs5z5-push\" (UniqueName: 
\"kubernetes.io/secret/e28d6969-ebed-4cf6-bb79-47e69bd952b9-builder-dockercfg-fs5z5-push\") pod \"sg-bridge-2-build\" (UID: \"e28d6969-ebed-4cf6-bb79-47e69bd952b9\") " pod="service-telemetry/sg-bridge-2-build" Mar 16 00:30:17 crc kubenswrapper[4816]: I0316 00:30:17.325390 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e28d6969-ebed-4cf6-bb79-47e69bd952b9-node-pullsecrets\") pod \"sg-bridge-2-build\" (UID: \"e28d6969-ebed-4cf6-bb79-47e69bd952b9\") " pod="service-telemetry/sg-bridge-2-build" Mar 16 00:30:17 crc kubenswrapper[4816]: I0316 00:30:17.325425 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/e28d6969-ebed-4cf6-bb79-47e69bd952b9-buildworkdir\") pod \"sg-bridge-2-build\" (UID: \"e28d6969-ebed-4cf6-bb79-47e69bd952b9\") " pod="service-telemetry/sg-bridge-2-build" Mar 16 00:30:17 crc kubenswrapper[4816]: I0316 00:30:17.325447 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/e28d6969-ebed-4cf6-bb79-47e69bd952b9-container-storage-run\") pod \"sg-bridge-2-build\" (UID: \"e28d6969-ebed-4cf6-bb79-47e69bd952b9\") " pod="service-telemetry/sg-bridge-2-build" Mar 16 00:30:17 crc kubenswrapper[4816]: I0316 00:30:17.325468 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvt9g\" (UniqueName: \"kubernetes.io/projected/e28d6969-ebed-4cf6-bb79-47e69bd952b9-kube-api-access-xvt9g\") pod \"sg-bridge-2-build\" (UID: \"e28d6969-ebed-4cf6-bb79-47e69bd952b9\") " pod="service-telemetry/sg-bridge-2-build" Mar 16 00:30:17 crc kubenswrapper[4816]: I0316 00:30:17.325496 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" 
(UniqueName: \"kubernetes.io/host-path/e28d6969-ebed-4cf6-bb79-47e69bd952b9-buildcachedir\") pod \"sg-bridge-2-build\" (UID: \"e28d6969-ebed-4cf6-bb79-47e69bd952b9\") " pod="service-telemetry/sg-bridge-2-build" Mar 16 00:30:17 crc kubenswrapper[4816]: I0316 00:30:17.325526 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e28d6969-ebed-4cf6-bb79-47e69bd952b9-build-ca-bundles\") pod \"sg-bridge-2-build\" (UID: \"e28d6969-ebed-4cf6-bb79-47e69bd952b9\") " pod="service-telemetry/sg-bridge-2-build" Mar 16 00:30:17 crc kubenswrapper[4816]: I0316 00:30:17.325582 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/e28d6969-ebed-4cf6-bb79-47e69bd952b9-build-system-configs\") pod \"sg-bridge-2-build\" (UID: \"e28d6969-ebed-4cf6-bb79-47e69bd952b9\") " pod="service-telemetry/sg-bridge-2-build" Mar 16 00:30:17 crc kubenswrapper[4816]: I0316 00:30:17.325605 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/e28d6969-ebed-4cf6-bb79-47e69bd952b9-container-storage-root\") pod \"sg-bridge-2-build\" (UID: \"e28d6969-ebed-4cf6-bb79-47e69bd952b9\") " pod="service-telemetry/sg-bridge-2-build" Mar 16 00:30:17 crc kubenswrapper[4816]: I0316 00:30:17.325643 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e28d6969-ebed-4cf6-bb79-47e69bd952b9-build-proxy-ca-bundles\") pod \"sg-bridge-2-build\" (UID: \"e28d6969-ebed-4cf6-bb79-47e69bd952b9\") " pod="service-telemetry/sg-bridge-2-build" Mar 16 00:30:17 crc kubenswrapper[4816]: I0316 00:30:17.427400 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"builder-dockercfg-fs5z5-pull\" (UniqueName: \"kubernetes.io/secret/e28d6969-ebed-4cf6-bb79-47e69bd952b9-builder-dockercfg-fs5z5-pull\") pod \"sg-bridge-2-build\" (UID: \"e28d6969-ebed-4cf6-bb79-47e69bd952b9\") " pod="service-telemetry/sg-bridge-2-build" Mar 16 00:30:17 crc kubenswrapper[4816]: I0316 00:30:17.427901 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/e28d6969-ebed-4cf6-bb79-47e69bd952b9-build-blob-cache\") pod \"sg-bridge-2-build\" (UID: \"e28d6969-ebed-4cf6-bb79-47e69bd952b9\") " pod="service-telemetry/sg-bridge-2-build" Mar 16 00:30:17 crc kubenswrapper[4816]: I0316 00:30:17.428143 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-fs5z5-push\" (UniqueName: \"kubernetes.io/secret/e28d6969-ebed-4cf6-bb79-47e69bd952b9-builder-dockercfg-fs5z5-push\") pod \"sg-bridge-2-build\" (UID: \"e28d6969-ebed-4cf6-bb79-47e69bd952b9\") " pod="service-telemetry/sg-bridge-2-build" Mar 16 00:30:17 crc kubenswrapper[4816]: I0316 00:30:17.428355 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e28d6969-ebed-4cf6-bb79-47e69bd952b9-node-pullsecrets\") pod \"sg-bridge-2-build\" (UID: \"e28d6969-ebed-4cf6-bb79-47e69bd952b9\") " pod="service-telemetry/sg-bridge-2-build" Mar 16 00:30:17 crc kubenswrapper[4816]: I0316 00:30:17.428513 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/e28d6969-ebed-4cf6-bb79-47e69bd952b9-build-blob-cache\") pod \"sg-bridge-2-build\" (UID: \"e28d6969-ebed-4cf6-bb79-47e69bd952b9\") " pod="service-telemetry/sg-bridge-2-build" Mar 16 00:30:17 crc kubenswrapper[4816]: I0316 00:30:17.428826 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: 
\"kubernetes.io/host-path/e28d6969-ebed-4cf6-bb79-47e69bd952b9-node-pullsecrets\") pod \"sg-bridge-2-build\" (UID: \"e28d6969-ebed-4cf6-bb79-47e69bd952b9\") " pod="service-telemetry/sg-bridge-2-build" Mar 16 00:30:17 crc kubenswrapper[4816]: I0316 00:30:17.429059 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/e28d6969-ebed-4cf6-bb79-47e69bd952b9-buildworkdir\") pod \"sg-bridge-2-build\" (UID: \"e28d6969-ebed-4cf6-bb79-47e69bd952b9\") " pod="service-telemetry/sg-bridge-2-build" Mar 16 00:30:17 crc kubenswrapper[4816]: I0316 00:30:17.429279 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/e28d6969-ebed-4cf6-bb79-47e69bd952b9-buildworkdir\") pod \"sg-bridge-2-build\" (UID: \"e28d6969-ebed-4cf6-bb79-47e69bd952b9\") " pod="service-telemetry/sg-bridge-2-build" Mar 16 00:30:17 crc kubenswrapper[4816]: I0316 00:30:17.429472 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/e28d6969-ebed-4cf6-bb79-47e69bd952b9-container-storage-run\") pod \"sg-bridge-2-build\" (UID: \"e28d6969-ebed-4cf6-bb79-47e69bd952b9\") " pod="service-telemetry/sg-bridge-2-build" Mar 16 00:30:17 crc kubenswrapper[4816]: I0316 00:30:17.429728 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xvt9g\" (UniqueName: \"kubernetes.io/projected/e28d6969-ebed-4cf6-bb79-47e69bd952b9-kube-api-access-xvt9g\") pod \"sg-bridge-2-build\" (UID: \"e28d6969-ebed-4cf6-bb79-47e69bd952b9\") " pod="service-telemetry/sg-bridge-2-build" Mar 16 00:30:17 crc kubenswrapper[4816]: I0316 00:30:17.429995 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/e28d6969-ebed-4cf6-bb79-47e69bd952b9-buildcachedir\") pod \"sg-bridge-2-build\" (UID: 
\"e28d6969-ebed-4cf6-bb79-47e69bd952b9\") " pod="service-telemetry/sg-bridge-2-build" Mar 16 00:30:17 crc kubenswrapper[4816]: I0316 00:30:17.430283 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e28d6969-ebed-4cf6-bb79-47e69bd952b9-build-ca-bundles\") pod \"sg-bridge-2-build\" (UID: \"e28d6969-ebed-4cf6-bb79-47e69bd952b9\") " pod="service-telemetry/sg-bridge-2-build" Mar 16 00:30:17 crc kubenswrapper[4816]: I0316 00:30:17.430518 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/e28d6969-ebed-4cf6-bb79-47e69bd952b9-build-system-configs\") pod \"sg-bridge-2-build\" (UID: \"e28d6969-ebed-4cf6-bb79-47e69bd952b9\") " pod="service-telemetry/sg-bridge-2-build" Mar 16 00:30:17 crc kubenswrapper[4816]: I0316 00:30:17.430767 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/e28d6969-ebed-4cf6-bb79-47e69bd952b9-container-storage-root\") pod \"sg-bridge-2-build\" (UID: \"e28d6969-ebed-4cf6-bb79-47e69bd952b9\") " pod="service-telemetry/sg-bridge-2-build" Mar 16 00:30:17 crc kubenswrapper[4816]: I0316 00:30:17.430071 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/e28d6969-ebed-4cf6-bb79-47e69bd952b9-buildcachedir\") pod \"sg-bridge-2-build\" (UID: \"e28d6969-ebed-4cf6-bb79-47e69bd952b9\") " pod="service-telemetry/sg-bridge-2-build" Mar 16 00:30:17 crc kubenswrapper[4816]: I0316 00:30:17.429656 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/e28d6969-ebed-4cf6-bb79-47e69bd952b9-container-storage-run\") pod \"sg-bridge-2-build\" (UID: \"e28d6969-ebed-4cf6-bb79-47e69bd952b9\") " pod="service-telemetry/sg-bridge-2-build" Mar 16 
00:30:17 crc kubenswrapper[4816]: I0316 00:30:17.431947 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-fs5z5-pull\" (UniqueName: \"kubernetes.io/secret/e28d6969-ebed-4cf6-bb79-47e69bd952b9-builder-dockercfg-fs5z5-pull\") pod \"sg-bridge-2-build\" (UID: \"e28d6969-ebed-4cf6-bb79-47e69bd952b9\") " pod="service-telemetry/sg-bridge-2-build" Mar 16 00:30:17 crc kubenswrapper[4816]: I0316 00:30:17.432292 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e28d6969-ebed-4cf6-bb79-47e69bd952b9-build-ca-bundles\") pod \"sg-bridge-2-build\" (UID: \"e28d6969-ebed-4cf6-bb79-47e69bd952b9\") " pod="service-telemetry/sg-bridge-2-build" Mar 16 00:30:17 crc kubenswrapper[4816]: I0316 00:30:17.437233 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/e28d6969-ebed-4cf6-bb79-47e69bd952b9-container-storage-root\") pod \"sg-bridge-2-build\" (UID: \"e28d6969-ebed-4cf6-bb79-47e69bd952b9\") " pod="service-telemetry/sg-bridge-2-build" Mar 16 00:30:17 crc kubenswrapper[4816]: I0316 00:30:17.437306 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e28d6969-ebed-4cf6-bb79-47e69bd952b9-build-proxy-ca-bundles\") pod \"sg-bridge-2-build\" (UID: \"e28d6969-ebed-4cf6-bb79-47e69bd952b9\") " pod="service-telemetry/sg-bridge-2-build" Mar 16 00:30:17 crc kubenswrapper[4816]: I0316 00:30:17.437535 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/e28d6969-ebed-4cf6-bb79-47e69bd952b9-build-system-configs\") pod \"sg-bridge-2-build\" (UID: \"e28d6969-ebed-4cf6-bb79-47e69bd952b9\") " pod="service-telemetry/sg-bridge-2-build" Mar 16 00:30:17 crc kubenswrapper[4816]: I0316 00:30:17.437948 4816 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e28d6969-ebed-4cf6-bb79-47e69bd952b9-build-proxy-ca-bundles\") pod \"sg-bridge-2-build\" (UID: \"e28d6969-ebed-4cf6-bb79-47e69bd952b9\") " pod="service-telemetry/sg-bridge-2-build" Mar 16 00:30:17 crc kubenswrapper[4816]: I0316 00:30:17.437977 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-fs5z5-push\" (UniqueName: \"kubernetes.io/secret/e28d6969-ebed-4cf6-bb79-47e69bd952b9-builder-dockercfg-fs5z5-push\") pod \"sg-bridge-2-build\" (UID: \"e28d6969-ebed-4cf6-bb79-47e69bd952b9\") " pod="service-telemetry/sg-bridge-2-build" Mar 16 00:30:17 crc kubenswrapper[4816]: I0316 00:30:17.444771 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xvt9g\" (UniqueName: \"kubernetes.io/projected/e28d6969-ebed-4cf6-bb79-47e69bd952b9-kube-api-access-xvt9g\") pod \"sg-bridge-2-build\" (UID: \"e28d6969-ebed-4cf6-bb79-47e69bd952b9\") " pod="service-telemetry/sg-bridge-2-build" Mar 16 00:30:17 crc kubenswrapper[4816]: I0316 00:30:17.624316 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-bridge-2-build" Mar 16 00:30:17 crc kubenswrapper[4816]: I0316 00:30:17.675565 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a12a3fd-22a7-4cc3-ac53-7463cafa502b" path="/var/lib/kubelet/pods/8a12a3fd-22a7-4cc3-ac53-7463cafa502b/volumes" Mar 16 00:30:18 crc kubenswrapper[4816]: I0316 00:30:18.044404 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-bridge-2-build"] Mar 16 00:30:18 crc kubenswrapper[4816]: I0316 00:30:18.897144 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-2-build" event={"ID":"e28d6969-ebed-4cf6-bb79-47e69bd952b9","Type":"ContainerStarted","Data":"e670917e4f348cf3256ada42b19e331e3cfaa4ac463d6f46f290fec2ade196ca"} Mar 16 00:30:18 crc kubenswrapper[4816]: I0316 00:30:18.897377 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-2-build" event={"ID":"e28d6969-ebed-4cf6-bb79-47e69bd952b9","Type":"ContainerStarted","Data":"b69b4af40f3f142efec97d6238cfcbf6adef1518048da6359bdb13cfce32e6b9"} Mar 16 00:30:19 crc kubenswrapper[4816]: E0316 00:30:19.010032 4816 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.158:52852->38.102.83.158:35591: write tcp 38.102.83.158:52852->38.102.83.158:35591: write: connection reset by peer Mar 16 00:30:19 crc kubenswrapper[4816]: I0316 00:30:19.906844 4816 generic.go:334] "Generic (PLEG): container finished" podID="e28d6969-ebed-4cf6-bb79-47e69bd952b9" containerID="e670917e4f348cf3256ada42b19e331e3cfaa4ac463d6f46f290fec2ade196ca" exitCode=0 Mar 16 00:30:19 crc kubenswrapper[4816]: I0316 00:30:19.906910 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-2-build" event={"ID":"e28d6969-ebed-4cf6-bb79-47e69bd952b9","Type":"ContainerDied","Data":"e670917e4f348cf3256ada42b19e331e3cfaa4ac463d6f46f290fec2ade196ca"} Mar 16 00:30:20 crc kubenswrapper[4816]: I0316 
00:30:20.917102 4816 generic.go:334] "Generic (PLEG): container finished" podID="e28d6969-ebed-4cf6-bb79-47e69bd952b9" containerID="3f1fc959adebb5ee4efbb86c12974091024b401490dc7b39e795e6fb43175c7f" exitCode=0 Mar 16 00:30:20 crc kubenswrapper[4816]: I0316 00:30:20.917149 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-2-build" event={"ID":"e28d6969-ebed-4cf6-bb79-47e69bd952b9","Type":"ContainerDied","Data":"3f1fc959adebb5ee4efbb86c12974091024b401490dc7b39e795e6fb43175c7f"} Mar 16 00:30:20 crc kubenswrapper[4816]: I0316 00:30:20.989187 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_sg-bridge-2-build_e28d6969-ebed-4cf6-bb79-47e69bd952b9/manage-dockerfile/0.log" Mar 16 00:30:21 crc kubenswrapper[4816]: I0316 00:30:21.926696 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-2-build" event={"ID":"e28d6969-ebed-4cf6-bb79-47e69bd952b9","Type":"ContainerStarted","Data":"9281601de1505bb064fa6238e6ff325c949730a49bc3d2962b8ca1dead5e53a7"} Mar 16 00:30:21 crc kubenswrapper[4816]: I0316 00:30:21.963364 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/sg-bridge-2-build" podStartSLOduration=4.963343551 podStartE2EDuration="4.963343551s" podCreationTimestamp="2026-03-16 00:30:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:30:21.961826042 +0000 UTC m=+1415.058126005" watchObservedRunningTime="2026-03-16 00:30:21.963343551 +0000 UTC m=+1415.059643504" Mar 16 00:31:04 crc kubenswrapper[4816]: I0316 00:31:04.311507 4816 scope.go:117] "RemoveContainer" containerID="6050167de1d894cd0016711271e17ed54f0e6320bd8403d36883159d39c3c966" Mar 16 00:31:06 crc kubenswrapper[4816]: I0316 00:31:06.254279 4816 generic.go:334] "Generic (PLEG): container finished" podID="e28d6969-ebed-4cf6-bb79-47e69bd952b9" 
containerID="9281601de1505bb064fa6238e6ff325c949730a49bc3d2962b8ca1dead5e53a7" exitCode=0 Mar 16 00:31:06 crc kubenswrapper[4816]: I0316 00:31:06.254342 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-2-build" event={"ID":"e28d6969-ebed-4cf6-bb79-47e69bd952b9","Type":"ContainerDied","Data":"9281601de1505bb064fa6238e6ff325c949730a49bc3d2962b8ca1dead5e53a7"} Mar 16 00:31:07 crc kubenswrapper[4816]: I0316 00:31:07.489800 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-bridge-2-build" Mar 16 00:31:07 crc kubenswrapper[4816]: I0316 00:31:07.625351 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e28d6969-ebed-4cf6-bb79-47e69bd952b9-build-proxy-ca-bundles\") pod \"e28d6969-ebed-4cf6-bb79-47e69bd952b9\" (UID: \"e28d6969-ebed-4cf6-bb79-47e69bd952b9\") " Mar 16 00:31:07 crc kubenswrapper[4816]: I0316 00:31:07.625409 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-fs5z5-push\" (UniqueName: \"kubernetes.io/secret/e28d6969-ebed-4cf6-bb79-47e69bd952b9-builder-dockercfg-fs5z5-push\") pod \"e28d6969-ebed-4cf6-bb79-47e69bd952b9\" (UID: \"e28d6969-ebed-4cf6-bb79-47e69bd952b9\") " Mar 16 00:31:07 crc kubenswrapper[4816]: I0316 00:31:07.625429 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/e28d6969-ebed-4cf6-bb79-47e69bd952b9-container-storage-root\") pod \"e28d6969-ebed-4cf6-bb79-47e69bd952b9\" (UID: \"e28d6969-ebed-4cf6-bb79-47e69bd952b9\") " Mar 16 00:31:07 crc kubenswrapper[4816]: I0316 00:31:07.625448 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/e28d6969-ebed-4cf6-bb79-47e69bd952b9-build-system-configs\") pod 
\"e28d6969-ebed-4cf6-bb79-47e69bd952b9\" (UID: \"e28d6969-ebed-4cf6-bb79-47e69bd952b9\") " Mar 16 00:31:07 crc kubenswrapper[4816]: I0316 00:31:07.625475 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e28d6969-ebed-4cf6-bb79-47e69bd952b9-node-pullsecrets\") pod \"e28d6969-ebed-4cf6-bb79-47e69bd952b9\" (UID: \"e28d6969-ebed-4cf6-bb79-47e69bd952b9\") " Mar 16 00:31:07 crc kubenswrapper[4816]: I0316 00:31:07.625723 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-fs5z5-pull\" (UniqueName: \"kubernetes.io/secret/e28d6969-ebed-4cf6-bb79-47e69bd952b9-builder-dockercfg-fs5z5-pull\") pod \"e28d6969-ebed-4cf6-bb79-47e69bd952b9\" (UID: \"e28d6969-ebed-4cf6-bb79-47e69bd952b9\") " Mar 16 00:31:07 crc kubenswrapper[4816]: I0316 00:31:07.625764 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/e28d6969-ebed-4cf6-bb79-47e69bd952b9-build-blob-cache\") pod \"e28d6969-ebed-4cf6-bb79-47e69bd952b9\" (UID: \"e28d6969-ebed-4cf6-bb79-47e69bd952b9\") " Mar 16 00:31:07 crc kubenswrapper[4816]: I0316 00:31:07.625797 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/e28d6969-ebed-4cf6-bb79-47e69bd952b9-buildworkdir\") pod \"e28d6969-ebed-4cf6-bb79-47e69bd952b9\" (UID: \"e28d6969-ebed-4cf6-bb79-47e69bd952b9\") " Mar 16 00:31:07 crc kubenswrapper[4816]: I0316 00:31:07.625827 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e28d6969-ebed-4cf6-bb79-47e69bd952b9-build-ca-bundles\") pod \"e28d6969-ebed-4cf6-bb79-47e69bd952b9\" (UID: \"e28d6969-ebed-4cf6-bb79-47e69bd952b9\") " Mar 16 00:31:07 crc kubenswrapper[4816]: I0316 00:31:07.625798 4816 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e28d6969-ebed-4cf6-bb79-47e69bd952b9-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "e28d6969-ebed-4cf6-bb79-47e69bd952b9" (UID: "e28d6969-ebed-4cf6-bb79-47e69bd952b9"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:31:07 crc kubenswrapper[4816]: I0316 00:31:07.625852 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/e28d6969-ebed-4cf6-bb79-47e69bd952b9-container-storage-run\") pod \"e28d6969-ebed-4cf6-bb79-47e69bd952b9\" (UID: \"e28d6969-ebed-4cf6-bb79-47e69bd952b9\") " Mar 16 00:31:07 crc kubenswrapper[4816]: I0316 00:31:07.625887 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xvt9g\" (UniqueName: \"kubernetes.io/projected/e28d6969-ebed-4cf6-bb79-47e69bd952b9-kube-api-access-xvt9g\") pod \"e28d6969-ebed-4cf6-bb79-47e69bd952b9\" (UID: \"e28d6969-ebed-4cf6-bb79-47e69bd952b9\") " Mar 16 00:31:07 crc kubenswrapper[4816]: I0316 00:31:07.625919 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/e28d6969-ebed-4cf6-bb79-47e69bd952b9-buildcachedir\") pod \"e28d6969-ebed-4cf6-bb79-47e69bd952b9\" (UID: \"e28d6969-ebed-4cf6-bb79-47e69bd952b9\") " Mar 16 00:31:07 crc kubenswrapper[4816]: I0316 00:31:07.626195 4816 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e28d6969-ebed-4cf6-bb79-47e69bd952b9-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 16 00:31:07 crc kubenswrapper[4816]: I0316 00:31:07.626213 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e28d6969-ebed-4cf6-bb79-47e69bd952b9-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod 
"e28d6969-ebed-4cf6-bb79-47e69bd952b9" (UID: "e28d6969-ebed-4cf6-bb79-47e69bd952b9"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:31:07 crc kubenswrapper[4816]: I0316 00:31:07.626298 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e28d6969-ebed-4cf6-bb79-47e69bd952b9-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "e28d6969-ebed-4cf6-bb79-47e69bd952b9" (UID: "e28d6969-ebed-4cf6-bb79-47e69bd952b9"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:31:07 crc kubenswrapper[4816]: I0316 00:31:07.626762 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e28d6969-ebed-4cf6-bb79-47e69bd952b9-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "e28d6969-ebed-4cf6-bb79-47e69bd952b9" (UID: "e28d6969-ebed-4cf6-bb79-47e69bd952b9"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:31:07 crc kubenswrapper[4816]: I0316 00:31:07.627881 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e28d6969-ebed-4cf6-bb79-47e69bd952b9-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "e28d6969-ebed-4cf6-bb79-47e69bd952b9" (UID: "e28d6969-ebed-4cf6-bb79-47e69bd952b9"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:31:07 crc kubenswrapper[4816]: I0316 00:31:07.628724 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e28d6969-ebed-4cf6-bb79-47e69bd952b9-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "e28d6969-ebed-4cf6-bb79-47e69bd952b9" (UID: "e28d6969-ebed-4cf6-bb79-47e69bd952b9"). InnerVolumeSpecName "build-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:31:07 crc kubenswrapper[4816]: I0316 00:31:07.629092 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e28d6969-ebed-4cf6-bb79-47e69bd952b9-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "e28d6969-ebed-4cf6-bb79-47e69bd952b9" (UID: "e28d6969-ebed-4cf6-bb79-47e69bd952b9"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:31:07 crc kubenswrapper[4816]: I0316 00:31:07.631078 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e28d6969-ebed-4cf6-bb79-47e69bd952b9-kube-api-access-xvt9g" (OuterVolumeSpecName: "kube-api-access-xvt9g") pod "e28d6969-ebed-4cf6-bb79-47e69bd952b9" (UID: "e28d6969-ebed-4cf6-bb79-47e69bd952b9"). InnerVolumeSpecName "kube-api-access-xvt9g". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:31:07 crc kubenswrapper[4816]: I0316 00:31:07.636307 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e28d6969-ebed-4cf6-bb79-47e69bd952b9-builder-dockercfg-fs5z5-push" (OuterVolumeSpecName: "builder-dockercfg-fs5z5-push") pod "e28d6969-ebed-4cf6-bb79-47e69bd952b9" (UID: "e28d6969-ebed-4cf6-bb79-47e69bd952b9"). InnerVolumeSpecName "builder-dockercfg-fs5z5-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:31:07 crc kubenswrapper[4816]: I0316 00:31:07.637167 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e28d6969-ebed-4cf6-bb79-47e69bd952b9-builder-dockercfg-fs5z5-pull" (OuterVolumeSpecName: "builder-dockercfg-fs5z5-pull") pod "e28d6969-ebed-4cf6-bb79-47e69bd952b9" (UID: "e28d6969-ebed-4cf6-bb79-47e69bd952b9"). InnerVolumeSpecName "builder-dockercfg-fs5z5-pull". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:31:07 crc kubenswrapper[4816]: I0316 00:31:07.731315 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e28d6969-ebed-4cf6-bb79-47e69bd952b9-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "e28d6969-ebed-4cf6-bb79-47e69bd952b9" (UID: "e28d6969-ebed-4cf6-bb79-47e69bd952b9"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:31:07 crc kubenswrapper[4816]: I0316 00:31:07.732212 4816 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e28d6969-ebed-4cf6-bb79-47e69bd952b9-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 16 00:31:07 crc kubenswrapper[4816]: I0316 00:31:07.732229 4816 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-fs5z5-push\" (UniqueName: \"kubernetes.io/secret/e28d6969-ebed-4cf6-bb79-47e69bd952b9-builder-dockercfg-fs5z5-push\") on node \"crc\" DevicePath \"\"" Mar 16 00:31:07 crc kubenswrapper[4816]: I0316 00:31:07.732240 4816 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/e28d6969-ebed-4cf6-bb79-47e69bd952b9-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 16 00:31:07 crc kubenswrapper[4816]: I0316 00:31:07.732249 4816 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-fs5z5-pull\" (UniqueName: \"kubernetes.io/secret/e28d6969-ebed-4cf6-bb79-47e69bd952b9-builder-dockercfg-fs5z5-pull\") on node \"crc\" DevicePath \"\"" Mar 16 00:31:07 crc kubenswrapper[4816]: I0316 00:31:07.732257 4816 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/e28d6969-ebed-4cf6-bb79-47e69bd952b9-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 16 00:31:07 crc kubenswrapper[4816]: I0316 00:31:07.732266 4816 
reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/e28d6969-ebed-4cf6-bb79-47e69bd952b9-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 16 00:31:07 crc kubenswrapper[4816]: I0316 00:31:07.732276 4816 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e28d6969-ebed-4cf6-bb79-47e69bd952b9-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 16 00:31:07 crc kubenswrapper[4816]: I0316 00:31:07.732284 4816 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/e28d6969-ebed-4cf6-bb79-47e69bd952b9-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 16 00:31:07 crc kubenswrapper[4816]: I0316 00:31:07.732292 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xvt9g\" (UniqueName: \"kubernetes.io/projected/e28d6969-ebed-4cf6-bb79-47e69bd952b9-kube-api-access-xvt9g\") on node \"crc\" DevicePath \"\"" Mar 16 00:31:07 crc kubenswrapper[4816]: I0316 00:31:07.732300 4816 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/e28d6969-ebed-4cf6-bb79-47e69bd952b9-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 16 00:31:08 crc kubenswrapper[4816]: I0316 00:31:08.269572 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-2-build" event={"ID":"e28d6969-ebed-4cf6-bb79-47e69bd952b9","Type":"ContainerDied","Data":"b69b4af40f3f142efec97d6238cfcbf6adef1518048da6359bdb13cfce32e6b9"} Mar 16 00:31:08 crc kubenswrapper[4816]: I0316 00:31:08.269619 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b69b4af40f3f142efec97d6238cfcbf6adef1518048da6359bdb13cfce32e6b9" Mar 16 00:31:08 crc kubenswrapper[4816]: I0316 00:31:08.269743 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-bridge-2-build" Mar 16 00:31:08 crc kubenswrapper[4816]: I0316 00:31:08.383516 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e28d6969-ebed-4cf6-bb79-47e69bd952b9-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "e28d6969-ebed-4cf6-bb79-47e69bd952b9" (UID: "e28d6969-ebed-4cf6-bb79-47e69bd952b9"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:31:08 crc kubenswrapper[4816]: I0316 00:31:08.443544 4816 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/e28d6969-ebed-4cf6-bb79-47e69bd952b9-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 16 00:31:11 crc kubenswrapper[4816]: I0316 00:31:11.637991 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/prometheus-webhook-snmp-1-build"] Mar 16 00:31:11 crc kubenswrapper[4816]: E0316 00:31:11.638444 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e28d6969-ebed-4cf6-bb79-47e69bd952b9" containerName="manage-dockerfile" Mar 16 00:31:11 crc kubenswrapper[4816]: I0316 00:31:11.638456 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="e28d6969-ebed-4cf6-bb79-47e69bd952b9" containerName="manage-dockerfile" Mar 16 00:31:11 crc kubenswrapper[4816]: E0316 00:31:11.638475 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e28d6969-ebed-4cf6-bb79-47e69bd952b9" containerName="docker-build" Mar 16 00:31:11 crc kubenswrapper[4816]: I0316 00:31:11.638481 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="e28d6969-ebed-4cf6-bb79-47e69bd952b9" containerName="docker-build" Mar 16 00:31:11 crc kubenswrapper[4816]: E0316 00:31:11.638490 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e28d6969-ebed-4cf6-bb79-47e69bd952b9" containerName="git-clone" Mar 16 00:31:11 crc 
kubenswrapper[4816]: I0316 00:31:11.638496 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="e28d6969-ebed-4cf6-bb79-47e69bd952b9" containerName="git-clone" Mar 16 00:31:11 crc kubenswrapper[4816]: I0316 00:31:11.638599 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="e28d6969-ebed-4cf6-bb79-47e69bd952b9" containerName="docker-build" Mar 16 00:31:11 crc kubenswrapper[4816]: I0316 00:31:11.639176 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 16 00:31:11 crc kubenswrapper[4816]: I0316 00:31:11.642413 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-webhook-snmp-1-global-ca" Mar 16 00:31:11 crc kubenswrapper[4816]: I0316 00:31:11.642449 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-webhook-snmp-1-ca" Mar 16 00:31:11 crc kubenswrapper[4816]: I0316 00:31:11.642742 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-webhook-snmp-1-sys-config" Mar 16 00:31:11 crc kubenswrapper[4816]: I0316 00:31:11.649592 4816 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-fs5z5" Mar 16 00:31:11 crc kubenswrapper[4816]: I0316 00:31:11.661407 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/prometheus-webhook-snmp-1-build"] Mar 16 00:31:11 crc kubenswrapper[4816]: I0316 00:31:11.686222 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3-container-storage-run\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 16 00:31:11 crc kubenswrapper[4816]: I0316 00:31:11.686522 4816 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3-build-system-configs\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 16 00:31:11 crc kubenswrapper[4816]: I0316 00:31:11.686643 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3-container-storage-root\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 16 00:31:11 crc kubenswrapper[4816]: I0316 00:31:11.686753 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3-node-pullsecrets\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 16 00:31:11 crc kubenswrapper[4816]: I0316 00:31:11.686870 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-fs5z5-push\" (UniqueName: \"kubernetes.io/secret/a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3-builder-dockercfg-fs5z5-push\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 16 00:31:11 crc kubenswrapper[4816]: I0316 00:31:11.686968 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-fs5z5-pull\" (UniqueName: \"kubernetes.io/secret/a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3-builder-dockercfg-fs5z5-pull\") pod 
\"prometheus-webhook-snmp-1-build\" (UID: \"a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 16 00:31:11 crc kubenswrapper[4816]: I0316 00:31:11.687095 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3-build-proxy-ca-bundles\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 16 00:31:11 crc kubenswrapper[4816]: I0316 00:31:11.687183 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbrs7\" (UniqueName: \"kubernetes.io/projected/a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3-kube-api-access-sbrs7\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 16 00:31:11 crc kubenswrapper[4816]: I0316 00:31:11.687300 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3-build-ca-bundles\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 16 00:31:11 crc kubenswrapper[4816]: I0316 00:31:11.687393 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3-build-blob-cache\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 16 00:31:11 crc kubenswrapper[4816]: I0316 00:31:11.687491 4816 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3-buildworkdir\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 16 00:31:11 crc kubenswrapper[4816]: I0316 00:31:11.687602 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3-buildcachedir\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 16 00:31:11 crc kubenswrapper[4816]: I0316 00:31:11.788516 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3-container-storage-run\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 16 00:31:11 crc kubenswrapper[4816]: I0316 00:31:11.788592 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3-build-system-configs\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 16 00:31:11 crc kubenswrapper[4816]: I0316 00:31:11.788622 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3-container-storage-root\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 
16 00:31:11 crc kubenswrapper[4816]: I0316 00:31:11.788658 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3-node-pullsecrets\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 16 00:31:11 crc kubenswrapper[4816]: I0316 00:31:11.788692 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-fs5z5-push\" (UniqueName: \"kubernetes.io/secret/a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3-builder-dockercfg-fs5z5-push\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 16 00:31:11 crc kubenswrapper[4816]: I0316 00:31:11.788715 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-fs5z5-pull\" (UniqueName: \"kubernetes.io/secret/a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3-builder-dockercfg-fs5z5-pull\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 16 00:31:11 crc kubenswrapper[4816]: I0316 00:31:11.788754 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3-build-proxy-ca-bundles\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 16 00:31:11 crc kubenswrapper[4816]: I0316 00:31:11.788775 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sbrs7\" (UniqueName: \"kubernetes.io/projected/a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3-kube-api-access-sbrs7\") pod \"prometheus-webhook-snmp-1-build\" (UID: 
\"a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 16 00:31:11 crc kubenswrapper[4816]: I0316 00:31:11.788808 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3-build-ca-bundles\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 16 00:31:11 crc kubenswrapper[4816]: I0316 00:31:11.788833 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3-build-blob-cache\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 16 00:31:11 crc kubenswrapper[4816]: I0316 00:31:11.788862 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3-buildworkdir\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 16 00:31:11 crc kubenswrapper[4816]: I0316 00:31:11.788885 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3-buildcachedir\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 16 00:31:11 crc kubenswrapper[4816]: I0316 00:31:11.788976 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3-buildcachedir\") pod 
\"prometheus-webhook-snmp-1-build\" (UID: \"a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 16 00:31:11 crc kubenswrapper[4816]: I0316 00:31:11.789026 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3-container-storage-run\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 16 00:31:11 crc kubenswrapper[4816]: I0316 00:31:11.789512 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3-build-system-configs\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 16 00:31:11 crc kubenswrapper[4816]: I0316 00:31:11.789596 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3-build-proxy-ca-bundles\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 16 00:31:11 crc kubenswrapper[4816]: I0316 00:31:11.789763 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3-buildworkdir\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 16 00:31:11 crc kubenswrapper[4816]: I0316 00:31:11.789880 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: 
\"kubernetes.io/host-path/a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3-node-pullsecrets\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 16 00:31:11 crc kubenswrapper[4816]: I0316 00:31:11.790589 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3-build-ca-bundles\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 16 00:31:11 crc kubenswrapper[4816]: I0316 00:31:11.790167 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3-build-blob-cache\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 16 00:31:11 crc kubenswrapper[4816]: I0316 00:31:11.790836 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3-container-storage-root\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 16 00:31:11 crc kubenswrapper[4816]: I0316 00:31:11.794947 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-fs5z5-pull\" (UniqueName: \"kubernetes.io/secret/a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3-builder-dockercfg-fs5z5-pull\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 16 00:31:11 crc kubenswrapper[4816]: I0316 00:31:11.796297 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"builder-dockercfg-fs5z5-push\" (UniqueName: \"kubernetes.io/secret/a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3-builder-dockercfg-fs5z5-push\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 16 00:31:11 crc kubenswrapper[4816]: I0316 00:31:11.808947 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbrs7\" (UniqueName: \"kubernetes.io/projected/a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3-kube-api-access-sbrs7\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 16 00:31:11 crc kubenswrapper[4816]: I0316 00:31:11.952062 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 16 00:31:12 crc kubenswrapper[4816]: I0316 00:31:12.399500 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/prometheus-webhook-snmp-1-build"] Mar 16 00:31:13 crc kubenswrapper[4816]: I0316 00:31:13.312359 4816 generic.go:334] "Generic (PLEG): container finished" podID="a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3" containerID="f5327b247a9795d7266c3a75f38fe58418f7533461eb2ded2bf746b2d1150a78" exitCode=0 Mar 16 00:31:13 crc kubenswrapper[4816]: I0316 00:31:13.312412 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-1-build" event={"ID":"a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3","Type":"ContainerDied","Data":"f5327b247a9795d7266c3a75f38fe58418f7533461eb2ded2bf746b2d1150a78"} Mar 16 00:31:13 crc kubenswrapper[4816]: I0316 00:31:13.312443 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-1-build" event={"ID":"a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3","Type":"ContainerStarted","Data":"65c516056523a7bbc848a7d18a23fbbd58a3dce1c2461275ee579b7c5dc75a6c"} Mar 16 00:31:14 crc 
kubenswrapper[4816]: I0316 00:31:14.320824 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-1-build" event={"ID":"a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3","Type":"ContainerStarted","Data":"d795bb859d1793e48db35fa3999d9de3e107ccefd779c6afab55acb3c338abb7"} Mar 16 00:31:14 crc kubenswrapper[4816]: I0316 00:31:14.356664 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/prometheus-webhook-snmp-1-build" podStartSLOduration=3.356644396 podStartE2EDuration="3.356644396s" podCreationTimestamp="2026-03-16 00:31:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:31:14.351181385 +0000 UTC m=+1467.447481348" watchObservedRunningTime="2026-03-16 00:31:14.356644396 +0000 UTC m=+1467.452944349" Mar 16 00:31:22 crc kubenswrapper[4816]: I0316 00:31:22.362698 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/prometheus-webhook-snmp-1-build"] Mar 16 00:31:22 crc kubenswrapper[4816]: I0316 00:31:22.363427 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/prometheus-webhook-snmp-1-build" podUID="a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3" containerName="docker-build" containerID="cri-o://d795bb859d1793e48db35fa3999d9de3e107ccefd779c6afab55acb3c338abb7" gracePeriod=30 Mar 16 00:31:22 crc kubenswrapper[4816]: I0316 00:31:22.701492 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_prometheus-webhook-snmp-1-build_a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3/docker-build/0.log" Mar 16 00:31:22 crc kubenswrapper[4816]: I0316 00:31:22.702057 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 16 00:31:22 crc kubenswrapper[4816]: I0316 00:31:22.766451 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3-node-pullsecrets\") pod \"a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3\" (UID: \"a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3\") " Mar 16 00:31:22 crc kubenswrapper[4816]: I0316 00:31:22.766563 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3-build-system-configs\") pod \"a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3\" (UID: \"a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3\") " Mar 16 00:31:22 crc kubenswrapper[4816]: I0316 00:31:22.766570 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3" (UID: "a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:31:22 crc kubenswrapper[4816]: I0316 00:31:22.766593 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3-buildcachedir\") pod \"a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3\" (UID: \"a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3\") " Mar 16 00:31:22 crc kubenswrapper[4816]: I0316 00:31:22.766632 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3" (UID: "a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3"). InnerVolumeSpecName "buildcachedir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:31:22 crc kubenswrapper[4816]: I0316 00:31:22.766689 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-fs5z5-push\" (UniqueName: \"kubernetes.io/secret/a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3-builder-dockercfg-fs5z5-push\") pod \"a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3\" (UID: \"a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3\") " Mar 16 00:31:22 crc kubenswrapper[4816]: I0316 00:31:22.766758 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3-build-blob-cache\") pod \"a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3\" (UID: \"a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3\") " Mar 16 00:31:22 crc kubenswrapper[4816]: I0316 00:31:22.766788 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3-build-ca-bundles\") pod \"a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3\" (UID: \"a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3\") " Mar 16 00:31:22 crc kubenswrapper[4816]: I0316 00:31:22.766808 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-fs5z5-pull\" (UniqueName: \"kubernetes.io/secret/a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3-builder-dockercfg-fs5z5-pull\") pod \"a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3\" (UID: \"a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3\") " Mar 16 00:31:22 crc kubenswrapper[4816]: I0316 00:31:22.766829 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3-container-storage-run\") pod \"a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3\" (UID: \"a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3\") " Mar 16 00:31:22 crc kubenswrapper[4816]: I0316 00:31:22.766884 4816 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3-buildworkdir\") pod \"a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3\" (UID: \"a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3\") " Mar 16 00:31:22 crc kubenswrapper[4816]: I0316 00:31:22.766904 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sbrs7\" (UniqueName: \"kubernetes.io/projected/a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3-kube-api-access-sbrs7\") pod \"a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3\" (UID: \"a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3\") " Mar 16 00:31:22 crc kubenswrapper[4816]: I0316 00:31:22.766925 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3-build-proxy-ca-bundles\") pod \"a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3\" (UID: \"a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3\") " Mar 16 00:31:22 crc kubenswrapper[4816]: I0316 00:31:22.766963 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3-container-storage-root\") pod \"a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3\" (UID: \"a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3\") " Mar 16 00:31:22 crc kubenswrapper[4816]: I0316 00:31:22.767855 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3" (UID: "a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3"). InnerVolumeSpecName "buildworkdir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:31:22 crc kubenswrapper[4816]: I0316 00:31:22.767988 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3" (UID: "a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:31:22 crc kubenswrapper[4816]: I0316 00:31:22.768027 4816 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 16 00:31:22 crc kubenswrapper[4816]: I0316 00:31:22.768049 4816 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 16 00:31:22 crc kubenswrapper[4816]: I0316 00:31:22.768189 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3" (UID: "a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:31:22 crc kubenswrapper[4816]: I0316 00:31:22.768649 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3" (UID: "a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3"). InnerVolumeSpecName "build-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:31:22 crc kubenswrapper[4816]: I0316 00:31:22.769171 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3" (UID: "a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:31:22 crc kubenswrapper[4816]: I0316 00:31:22.771909 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3-builder-dockercfg-fs5z5-pull" (OuterVolumeSpecName: "builder-dockercfg-fs5z5-pull") pod "a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3" (UID: "a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3"). InnerVolumeSpecName "builder-dockercfg-fs5z5-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:31:22 crc kubenswrapper[4816]: I0316 00:31:22.773271 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3-kube-api-access-sbrs7" (OuterVolumeSpecName: "kube-api-access-sbrs7") pod "a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3" (UID: "a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3"). InnerVolumeSpecName "kube-api-access-sbrs7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:31:22 crc kubenswrapper[4816]: I0316 00:31:22.773650 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3-builder-dockercfg-fs5z5-push" (OuterVolumeSpecName: "builder-dockercfg-fs5z5-push") pod "a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3" (UID: "a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3"). InnerVolumeSpecName "builder-dockercfg-fs5z5-push". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:31:22 crc kubenswrapper[4816]: I0316 00:31:22.841796 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3" (UID: "a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:31:22 crc kubenswrapper[4816]: I0316 00:31:22.869913 4816 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 16 00:31:22 crc kubenswrapper[4816]: I0316 00:31:22.869971 4816 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 16 00:31:22 crc kubenswrapper[4816]: I0316 00:31:22.869985 4816 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-fs5z5-pull\" (UniqueName: \"kubernetes.io/secret/a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3-builder-dockercfg-fs5z5-pull\") on node \"crc\" DevicePath \"\"" Mar 16 00:31:22 crc kubenswrapper[4816]: I0316 00:31:22.870002 4816 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 16 00:31:22 crc kubenswrapper[4816]: I0316 00:31:22.870016 4816 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 16 00:31:22 crc kubenswrapper[4816]: I0316 00:31:22.870028 4816 reconciler_common.go:293] "Volume detached for 
volume \"kube-api-access-sbrs7\" (UniqueName: \"kubernetes.io/projected/a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3-kube-api-access-sbrs7\") on node \"crc\" DevicePath \"\"" Mar 16 00:31:22 crc kubenswrapper[4816]: I0316 00:31:22.870037 4816 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 16 00:31:22 crc kubenswrapper[4816]: I0316 00:31:22.870046 4816 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 16 00:31:22 crc kubenswrapper[4816]: I0316 00:31:22.870055 4816 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-fs5z5-push\" (UniqueName: \"kubernetes.io/secret/a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3-builder-dockercfg-fs5z5-push\") on node \"crc\" DevicePath \"\"" Mar 16 00:31:23 crc kubenswrapper[4816]: I0316 00:31:23.140127 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3" (UID: "a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3"). InnerVolumeSpecName "container-storage-root". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:31:23 crc kubenswrapper[4816]: I0316 00:31:23.173538 4816 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 16 00:31:23 crc kubenswrapper[4816]: I0316 00:31:23.390221 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_prometheus-webhook-snmp-1-build_a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3/docker-build/0.log" Mar 16 00:31:23 crc kubenswrapper[4816]: I0316 00:31:23.391005 4816 generic.go:334] "Generic (PLEG): container finished" podID="a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3" containerID="d795bb859d1793e48db35fa3999d9de3e107ccefd779c6afab55acb3c338abb7" exitCode=1 Mar 16 00:31:23 crc kubenswrapper[4816]: I0316 00:31:23.391048 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-1-build" event={"ID":"a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3","Type":"ContainerDied","Data":"d795bb859d1793e48db35fa3999d9de3e107ccefd779c6afab55acb3c338abb7"} Mar 16 00:31:23 crc kubenswrapper[4816]: I0316 00:31:23.391084 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 16 00:31:23 crc kubenswrapper[4816]: I0316 00:31:23.391102 4816 scope.go:117] "RemoveContainer" containerID="d795bb859d1793e48db35fa3999d9de3e107ccefd779c6afab55acb3c338abb7" Mar 16 00:31:23 crc kubenswrapper[4816]: I0316 00:31:23.391089 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-1-build" event={"ID":"a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3","Type":"ContainerDied","Data":"65c516056523a7bbc848a7d18a23fbbd58a3dce1c2461275ee579b7c5dc75a6c"} Mar 16 00:31:23 crc kubenswrapper[4816]: I0316 00:31:23.427921 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/prometheus-webhook-snmp-1-build"] Mar 16 00:31:23 crc kubenswrapper[4816]: I0316 00:31:23.431911 4816 scope.go:117] "RemoveContainer" containerID="f5327b247a9795d7266c3a75f38fe58418f7533461eb2ded2bf746b2d1150a78" Mar 16 00:31:23 crc kubenswrapper[4816]: I0316 00:31:23.434626 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/prometheus-webhook-snmp-1-build"] Mar 16 00:31:23 crc kubenswrapper[4816]: I0316 00:31:23.456858 4816 scope.go:117] "RemoveContainer" containerID="d795bb859d1793e48db35fa3999d9de3e107ccefd779c6afab55acb3c338abb7" Mar 16 00:31:23 crc kubenswrapper[4816]: E0316 00:31:23.457465 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d795bb859d1793e48db35fa3999d9de3e107ccefd779c6afab55acb3c338abb7\": container with ID starting with d795bb859d1793e48db35fa3999d9de3e107ccefd779c6afab55acb3c338abb7 not found: ID does not exist" containerID="d795bb859d1793e48db35fa3999d9de3e107ccefd779c6afab55acb3c338abb7" Mar 16 00:31:23 crc kubenswrapper[4816]: I0316 00:31:23.457513 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d795bb859d1793e48db35fa3999d9de3e107ccefd779c6afab55acb3c338abb7"} err="failed 
to get container status \"d795bb859d1793e48db35fa3999d9de3e107ccefd779c6afab55acb3c338abb7\": rpc error: code = NotFound desc = could not find container \"d795bb859d1793e48db35fa3999d9de3e107ccefd779c6afab55acb3c338abb7\": container with ID starting with d795bb859d1793e48db35fa3999d9de3e107ccefd779c6afab55acb3c338abb7 not found: ID does not exist" Mar 16 00:31:23 crc kubenswrapper[4816]: I0316 00:31:23.457545 4816 scope.go:117] "RemoveContainer" containerID="f5327b247a9795d7266c3a75f38fe58418f7533461eb2ded2bf746b2d1150a78" Mar 16 00:31:23 crc kubenswrapper[4816]: E0316 00:31:23.458091 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f5327b247a9795d7266c3a75f38fe58418f7533461eb2ded2bf746b2d1150a78\": container with ID starting with f5327b247a9795d7266c3a75f38fe58418f7533461eb2ded2bf746b2d1150a78 not found: ID does not exist" containerID="f5327b247a9795d7266c3a75f38fe58418f7533461eb2ded2bf746b2d1150a78" Mar 16 00:31:23 crc kubenswrapper[4816]: I0316 00:31:23.458142 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5327b247a9795d7266c3a75f38fe58418f7533461eb2ded2bf746b2d1150a78"} err="failed to get container status \"f5327b247a9795d7266c3a75f38fe58418f7533461eb2ded2bf746b2d1150a78\": rpc error: code = NotFound desc = could not find container \"f5327b247a9795d7266c3a75f38fe58418f7533461eb2ded2bf746b2d1150a78\": container with ID starting with f5327b247a9795d7266c3a75f38fe58418f7533461eb2ded2bf746b2d1150a78 not found: ID does not exist" Mar 16 00:31:23 crc kubenswrapper[4816]: I0316 00:31:23.682086 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3" path="/var/lib/kubelet/pods/a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3/volumes" Mar 16 00:31:23 crc kubenswrapper[4816]: I0316 00:31:23.978682 4816 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["service-telemetry/prometheus-webhook-snmp-2-build"] Mar 16 00:31:23 crc kubenswrapper[4816]: E0316 00:31:23.978887 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3" containerName="docker-build" Mar 16 00:31:23 crc kubenswrapper[4816]: I0316 00:31:23.978899 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3" containerName="docker-build" Mar 16 00:31:23 crc kubenswrapper[4816]: E0316 00:31:23.978911 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3" containerName="manage-dockerfile" Mar 16 00:31:23 crc kubenswrapper[4816]: I0316 00:31:23.978918 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3" containerName="manage-dockerfile" Mar 16 00:31:23 crc kubenswrapper[4816]: I0316 00:31:23.979015 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0bd96c2-9ded-4a4b-8b36-d6bcf97135c3" containerName="docker-build" Mar 16 00:31:23 crc kubenswrapper[4816]: I0316 00:31:23.979751 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 16 00:31:23 crc kubenswrapper[4816]: I0316 00:31:23.981525 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-webhook-snmp-2-ca" Mar 16 00:31:23 crc kubenswrapper[4816]: I0316 00:31:23.981730 4816 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-fs5z5" Mar 16 00:31:23 crc kubenswrapper[4816]: I0316 00:31:23.981569 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-webhook-snmp-2-global-ca" Mar 16 00:31:23 crc kubenswrapper[4816]: I0316 00:31:23.982885 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-webhook-snmp-2-sys-config" Mar 16 00:31:24 crc kubenswrapper[4816]: I0316 00:31:24.002764 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/prometheus-webhook-snmp-2-build"] Mar 16 00:31:24 crc kubenswrapper[4816]: I0316 00:31:24.085314 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/8ee83ac3-283d-44bb-8ad6-e78604301d3a-container-storage-root\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"8ee83ac3-283d-44bb-8ad6-e78604301d3a\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 16 00:31:24 crc kubenswrapper[4816]: I0316 00:31:24.085376 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8ee83ac3-283d-44bb-8ad6-e78604301d3a-build-ca-bundles\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"8ee83ac3-283d-44bb-8ad6-e78604301d3a\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 16 00:31:24 crc kubenswrapper[4816]: I0316 00:31:24.085403 4816 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/8ee83ac3-283d-44bb-8ad6-e78604301d3a-buildcachedir\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"8ee83ac3-283d-44bb-8ad6-e78604301d3a\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 16 00:31:24 crc kubenswrapper[4816]: I0316 00:31:24.085439 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/8ee83ac3-283d-44bb-8ad6-e78604301d3a-node-pullsecrets\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"8ee83ac3-283d-44bb-8ad6-e78604301d3a\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 16 00:31:24 crc kubenswrapper[4816]: I0316 00:31:24.085462 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/8ee83ac3-283d-44bb-8ad6-e78604301d3a-build-system-configs\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"8ee83ac3-283d-44bb-8ad6-e78604301d3a\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 16 00:31:24 crc kubenswrapper[4816]: I0316 00:31:24.085496 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/8ee83ac3-283d-44bb-8ad6-e78604301d3a-container-storage-run\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"8ee83ac3-283d-44bb-8ad6-e78604301d3a\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 16 00:31:24 crc kubenswrapper[4816]: I0316 00:31:24.085601 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/8ee83ac3-283d-44bb-8ad6-e78604301d3a-buildworkdir\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"8ee83ac3-283d-44bb-8ad6-e78604301d3a\") " 
pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 16 00:31:24 crc kubenswrapper[4816]: I0316 00:31:24.085626 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/8ee83ac3-283d-44bb-8ad6-e78604301d3a-build-blob-cache\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"8ee83ac3-283d-44bb-8ad6-e78604301d3a\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 16 00:31:24 crc kubenswrapper[4816]: I0316 00:31:24.085660 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-fs5z5-pull\" (UniqueName: \"kubernetes.io/secret/8ee83ac3-283d-44bb-8ad6-e78604301d3a-builder-dockercfg-fs5z5-pull\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"8ee83ac3-283d-44bb-8ad6-e78604301d3a\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 16 00:31:24 crc kubenswrapper[4816]: I0316 00:31:24.085687 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-fs5z5-push\" (UniqueName: \"kubernetes.io/secret/8ee83ac3-283d-44bb-8ad6-e78604301d3a-builder-dockercfg-fs5z5-push\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"8ee83ac3-283d-44bb-8ad6-e78604301d3a\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 16 00:31:24 crc kubenswrapper[4816]: I0316 00:31:24.085754 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvw6d\" (UniqueName: \"kubernetes.io/projected/8ee83ac3-283d-44bb-8ad6-e78604301d3a-kube-api-access-jvw6d\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"8ee83ac3-283d-44bb-8ad6-e78604301d3a\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 16 00:31:24 crc kubenswrapper[4816]: I0316 00:31:24.085781 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8ee83ac3-283d-44bb-8ad6-e78604301d3a-build-proxy-ca-bundles\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"8ee83ac3-283d-44bb-8ad6-e78604301d3a\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 16 00:31:24 crc kubenswrapper[4816]: I0316 00:31:24.186575 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/8ee83ac3-283d-44bb-8ad6-e78604301d3a-buildworkdir\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"8ee83ac3-283d-44bb-8ad6-e78604301d3a\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 16 00:31:24 crc kubenswrapper[4816]: I0316 00:31:24.186885 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/8ee83ac3-283d-44bb-8ad6-e78604301d3a-build-blob-cache\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"8ee83ac3-283d-44bb-8ad6-e78604301d3a\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 16 00:31:24 crc kubenswrapper[4816]: I0316 00:31:24.187019 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-fs5z5-pull\" (UniqueName: \"kubernetes.io/secret/8ee83ac3-283d-44bb-8ad6-e78604301d3a-builder-dockercfg-fs5z5-pull\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"8ee83ac3-283d-44bb-8ad6-e78604301d3a\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 16 00:31:24 crc kubenswrapper[4816]: I0316 00:31:24.187163 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-fs5z5-push\" (UniqueName: \"kubernetes.io/secret/8ee83ac3-283d-44bb-8ad6-e78604301d3a-builder-dockercfg-fs5z5-push\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"8ee83ac3-283d-44bb-8ad6-e78604301d3a\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 16 00:31:24 crc kubenswrapper[4816]: I0316 00:31:24.187277 
4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jvw6d\" (UniqueName: \"kubernetes.io/projected/8ee83ac3-283d-44bb-8ad6-e78604301d3a-kube-api-access-jvw6d\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"8ee83ac3-283d-44bb-8ad6-e78604301d3a\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 16 00:31:24 crc kubenswrapper[4816]: I0316 00:31:24.187400 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8ee83ac3-283d-44bb-8ad6-e78604301d3a-build-proxy-ca-bundles\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"8ee83ac3-283d-44bb-8ad6-e78604301d3a\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 16 00:31:24 crc kubenswrapper[4816]: I0316 00:31:24.187486 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/8ee83ac3-283d-44bb-8ad6-e78604301d3a-build-blob-cache\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"8ee83ac3-283d-44bb-8ad6-e78604301d3a\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 16 00:31:24 crc kubenswrapper[4816]: I0316 00:31:24.187508 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/8ee83ac3-283d-44bb-8ad6-e78604301d3a-container-storage-root\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"8ee83ac3-283d-44bb-8ad6-e78604301d3a\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 16 00:31:24 crc kubenswrapper[4816]: I0316 00:31:24.187669 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8ee83ac3-283d-44bb-8ad6-e78604301d3a-build-ca-bundles\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"8ee83ac3-283d-44bb-8ad6-e78604301d3a\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" 
Mar 16 00:31:24 crc kubenswrapper[4816]: I0316 00:31:24.187712 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/8ee83ac3-283d-44bb-8ad6-e78604301d3a-buildcachedir\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"8ee83ac3-283d-44bb-8ad6-e78604301d3a\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 16 00:31:24 crc kubenswrapper[4816]: I0316 00:31:24.187759 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/8ee83ac3-283d-44bb-8ad6-e78604301d3a-node-pullsecrets\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"8ee83ac3-283d-44bb-8ad6-e78604301d3a\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 16 00:31:24 crc kubenswrapper[4816]: I0316 00:31:24.187801 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/8ee83ac3-283d-44bb-8ad6-e78604301d3a-build-system-configs\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"8ee83ac3-283d-44bb-8ad6-e78604301d3a\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 16 00:31:24 crc kubenswrapper[4816]: I0316 00:31:24.187880 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/8ee83ac3-283d-44bb-8ad6-e78604301d3a-container-storage-run\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"8ee83ac3-283d-44bb-8ad6-e78604301d3a\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 16 00:31:24 crc kubenswrapper[4816]: I0316 00:31:24.187921 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/8ee83ac3-283d-44bb-8ad6-e78604301d3a-buildcachedir\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"8ee83ac3-283d-44bb-8ad6-e78604301d3a\") " 
pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 16 00:31:24 crc kubenswrapper[4816]: I0316 00:31:24.188044 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8ee83ac3-283d-44bb-8ad6-e78604301d3a-build-proxy-ca-bundles\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"8ee83ac3-283d-44bb-8ad6-e78604301d3a\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 16 00:31:24 crc kubenswrapper[4816]: I0316 00:31:24.188500 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/8ee83ac3-283d-44bb-8ad6-e78604301d3a-container-storage-run\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"8ee83ac3-283d-44bb-8ad6-e78604301d3a\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 16 00:31:24 crc kubenswrapper[4816]: I0316 00:31:24.188643 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/8ee83ac3-283d-44bb-8ad6-e78604301d3a-node-pullsecrets\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"8ee83ac3-283d-44bb-8ad6-e78604301d3a\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 16 00:31:24 crc kubenswrapper[4816]: I0316 00:31:24.188772 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/8ee83ac3-283d-44bb-8ad6-e78604301d3a-container-storage-root\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"8ee83ac3-283d-44bb-8ad6-e78604301d3a\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 16 00:31:24 crc kubenswrapper[4816]: I0316 00:31:24.188882 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8ee83ac3-283d-44bb-8ad6-e78604301d3a-build-ca-bundles\") pod \"prometheus-webhook-snmp-2-build\" (UID: 
\"8ee83ac3-283d-44bb-8ad6-e78604301d3a\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 16 00:31:24 crc kubenswrapper[4816]: I0316 00:31:24.188884 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/8ee83ac3-283d-44bb-8ad6-e78604301d3a-build-system-configs\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"8ee83ac3-283d-44bb-8ad6-e78604301d3a\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 16 00:31:24 crc kubenswrapper[4816]: I0316 00:31:24.189322 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/8ee83ac3-283d-44bb-8ad6-e78604301d3a-buildworkdir\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"8ee83ac3-283d-44bb-8ad6-e78604301d3a\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 16 00:31:24 crc kubenswrapper[4816]: I0316 00:31:24.191395 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-fs5z5-pull\" (UniqueName: \"kubernetes.io/secret/8ee83ac3-283d-44bb-8ad6-e78604301d3a-builder-dockercfg-fs5z5-pull\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"8ee83ac3-283d-44bb-8ad6-e78604301d3a\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 16 00:31:24 crc kubenswrapper[4816]: I0316 00:31:24.192648 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-fs5z5-push\" (UniqueName: \"kubernetes.io/secret/8ee83ac3-283d-44bb-8ad6-e78604301d3a-builder-dockercfg-fs5z5-push\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"8ee83ac3-283d-44bb-8ad6-e78604301d3a\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 16 00:31:24 crc kubenswrapper[4816]: I0316 00:31:24.210239 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvw6d\" (UniqueName: 
\"kubernetes.io/projected/8ee83ac3-283d-44bb-8ad6-e78604301d3a-kube-api-access-jvw6d\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"8ee83ac3-283d-44bb-8ad6-e78604301d3a\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 16 00:31:24 crc kubenswrapper[4816]: I0316 00:31:24.294657 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 16 00:31:24 crc kubenswrapper[4816]: I0316 00:31:24.743409 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/prometheus-webhook-snmp-2-build"] Mar 16 00:31:25 crc kubenswrapper[4816]: I0316 00:31:25.410121 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-2-build" event={"ID":"8ee83ac3-283d-44bb-8ad6-e78604301d3a","Type":"ContainerStarted","Data":"bfc7e6a557fdaa90a0e926ad8cf423d7efa51c6fbbded9d245dc93674dc1ace2"} Mar 16 00:31:25 crc kubenswrapper[4816]: I0316 00:31:25.410510 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-2-build" event={"ID":"8ee83ac3-283d-44bb-8ad6-e78604301d3a","Type":"ContainerStarted","Data":"0113d4b80ed0b8fe7937cde689e4ea8e705fc8e36ad867715f3142b9de604104"} Mar 16 00:31:26 crc kubenswrapper[4816]: I0316 00:31:26.417274 4816 generic.go:334] "Generic (PLEG): container finished" podID="8ee83ac3-283d-44bb-8ad6-e78604301d3a" containerID="bfc7e6a557fdaa90a0e926ad8cf423d7efa51c6fbbded9d245dc93674dc1ace2" exitCode=0 Mar 16 00:31:26 crc kubenswrapper[4816]: I0316 00:31:26.417333 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-2-build" event={"ID":"8ee83ac3-283d-44bb-8ad6-e78604301d3a","Type":"ContainerDied","Data":"bfc7e6a557fdaa90a0e926ad8cf423d7efa51c6fbbded9d245dc93674dc1ace2"} Mar 16 00:31:27 crc kubenswrapper[4816]: I0316 00:31:27.428383 4816 generic.go:334] "Generic (PLEG): container finished" 
podID="8ee83ac3-283d-44bb-8ad6-e78604301d3a" containerID="e8de89caf63a1038e2ee8bd7df762a30c32b6504dafc4a5c1371cd611f17f793" exitCode=0 Mar 16 00:31:27 crc kubenswrapper[4816]: I0316 00:31:27.428463 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-2-build" event={"ID":"8ee83ac3-283d-44bb-8ad6-e78604301d3a","Type":"ContainerDied","Data":"e8de89caf63a1038e2ee8bd7df762a30c32b6504dafc4a5c1371cd611f17f793"} Mar 16 00:31:27 crc kubenswrapper[4816]: I0316 00:31:27.470724 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_prometheus-webhook-snmp-2-build_8ee83ac3-283d-44bb-8ad6-e78604301d3a/manage-dockerfile/0.log" Mar 16 00:31:28 crc kubenswrapper[4816]: I0316 00:31:28.442221 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-2-build" event={"ID":"8ee83ac3-283d-44bb-8ad6-e78604301d3a","Type":"ContainerStarted","Data":"0902f30878c51e4f3236bd867ace5550ff04114b09d7f0926101a8f73cf8cc0d"} Mar 16 00:32:00 crc kubenswrapper[4816]: I0316 00:32:00.127617 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/prometheus-webhook-snmp-2-build" podStartSLOduration=37.127597176 podStartE2EDuration="37.127597176s" podCreationTimestamp="2026-03-16 00:31:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:31:28.470767718 +0000 UTC m=+1481.567067701" watchObservedRunningTime="2026-03-16 00:32:00.127597176 +0000 UTC m=+1513.223897129" Mar 16 00:32:00 crc kubenswrapper[4816]: I0316 00:32:00.135287 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29560352-4c2cj"] Mar 16 00:32:00 crc kubenswrapper[4816]: I0316 00:32:00.136270 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29560352-4c2cj" Mar 16 00:32:00 crc kubenswrapper[4816]: I0316 00:32:00.138220 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 16 00:32:00 crc kubenswrapper[4816]: I0316 00:32:00.140856 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29560352-4c2cj"] Mar 16 00:32:00 crc kubenswrapper[4816]: I0316 00:32:00.140978 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 16 00:32:00 crc kubenswrapper[4816]: I0316 00:32:00.142773 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-8hc2r" Mar 16 00:32:00 crc kubenswrapper[4816]: I0316 00:32:00.215672 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdzr7\" (UniqueName: \"kubernetes.io/projected/cee5a2cc-4256-43fb-9517-83533a5acf29-kube-api-access-qdzr7\") pod \"auto-csr-approver-29560352-4c2cj\" (UID: \"cee5a2cc-4256-43fb-9517-83533a5acf29\") " pod="openshift-infra/auto-csr-approver-29560352-4c2cj" Mar 16 00:32:00 crc kubenswrapper[4816]: I0316 00:32:00.317056 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qdzr7\" (UniqueName: \"kubernetes.io/projected/cee5a2cc-4256-43fb-9517-83533a5acf29-kube-api-access-qdzr7\") pod \"auto-csr-approver-29560352-4c2cj\" (UID: \"cee5a2cc-4256-43fb-9517-83533a5acf29\") " pod="openshift-infra/auto-csr-approver-29560352-4c2cj" Mar 16 00:32:00 crc kubenswrapper[4816]: I0316 00:32:00.338700 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdzr7\" (UniqueName: \"kubernetes.io/projected/cee5a2cc-4256-43fb-9517-83533a5acf29-kube-api-access-qdzr7\") pod \"auto-csr-approver-29560352-4c2cj\" (UID: \"cee5a2cc-4256-43fb-9517-83533a5acf29\") " 
pod="openshift-infra/auto-csr-approver-29560352-4c2cj" Mar 16 00:32:00 crc kubenswrapper[4816]: I0316 00:32:00.460356 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560352-4c2cj" Mar 16 00:32:00 crc kubenswrapper[4816]: I0316 00:32:00.734240 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29560352-4c2cj"] Mar 16 00:32:01 crc kubenswrapper[4816]: I0316 00:32:01.674363 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560352-4c2cj" event={"ID":"cee5a2cc-4256-43fb-9517-83533a5acf29","Type":"ContainerStarted","Data":"846e84f9793585d6acd707d0990c6f9bf7f849e2e38887a2322410f6a6e52271"} Mar 16 00:32:01 crc kubenswrapper[4816]: I0316 00:32:01.863584 4816 patch_prober.go:28] interesting pod/machine-config-daemon-jrdcz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 16 00:32:01 crc kubenswrapper[4816]: I0316 00:32:01.863641 4816 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" podUID="dd08ece2-7636-4966-973a-e96a34b70b53" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 16 00:32:02 crc kubenswrapper[4816]: I0316 00:32:02.679477 4816 generic.go:334] "Generic (PLEG): container finished" podID="cee5a2cc-4256-43fb-9517-83533a5acf29" containerID="2169e8fca36c31b741a4793cc4a50c325f1ec3d6141a69fbe357f1c522080d5b" exitCode=0 Mar 16 00:32:02 crc kubenswrapper[4816]: I0316 00:32:02.679591 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560352-4c2cj" 
event={"ID":"cee5a2cc-4256-43fb-9517-83533a5acf29","Type":"ContainerDied","Data":"2169e8fca36c31b741a4793cc4a50c325f1ec3d6141a69fbe357f1c522080d5b"} Mar 16 00:32:03 crc kubenswrapper[4816]: I0316 00:32:03.923977 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560352-4c2cj" Mar 16 00:32:04 crc kubenswrapper[4816]: I0316 00:32:04.066798 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qdzr7\" (UniqueName: \"kubernetes.io/projected/cee5a2cc-4256-43fb-9517-83533a5acf29-kube-api-access-qdzr7\") pod \"cee5a2cc-4256-43fb-9517-83533a5acf29\" (UID: \"cee5a2cc-4256-43fb-9517-83533a5acf29\") " Mar 16 00:32:04 crc kubenswrapper[4816]: I0316 00:32:04.072981 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cee5a2cc-4256-43fb-9517-83533a5acf29-kube-api-access-qdzr7" (OuterVolumeSpecName: "kube-api-access-qdzr7") pod "cee5a2cc-4256-43fb-9517-83533a5acf29" (UID: "cee5a2cc-4256-43fb-9517-83533a5acf29"). InnerVolumeSpecName "kube-api-access-qdzr7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:32:04 crc kubenswrapper[4816]: I0316 00:32:04.168397 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qdzr7\" (UniqueName: \"kubernetes.io/projected/cee5a2cc-4256-43fb-9517-83533a5acf29-kube-api-access-qdzr7\") on node \"crc\" DevicePath \"\"" Mar 16 00:32:04 crc kubenswrapper[4816]: I0316 00:32:04.694344 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560352-4c2cj" event={"ID":"cee5a2cc-4256-43fb-9517-83533a5acf29","Type":"ContainerDied","Data":"846e84f9793585d6acd707d0990c6f9bf7f849e2e38887a2322410f6a6e52271"} Mar 16 00:32:04 crc kubenswrapper[4816]: I0316 00:32:04.694625 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="846e84f9793585d6acd707d0990c6f9bf7f849e2e38887a2322410f6a6e52271" Mar 16 00:32:04 crc kubenswrapper[4816]: I0316 00:32:04.694422 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560352-4c2cj" Mar 16 00:32:04 crc kubenswrapper[4816]: I0316 00:32:04.985156 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29560346-hjpvk"] Mar 16 00:32:04 crc kubenswrapper[4816]: I0316 00:32:04.992429 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29560346-hjpvk"] Mar 16 00:32:05 crc kubenswrapper[4816]: I0316 00:32:05.676432 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2942e78f-05b7-486f-bee0-93a942f80d8a" path="/var/lib/kubelet/pods/2942e78f-05b7-486f-bee0-93a942f80d8a/volumes" Mar 16 00:32:18 crc kubenswrapper[4816]: I0316 00:32:18.790825 4816 generic.go:334] "Generic (PLEG): container finished" podID="8ee83ac3-283d-44bb-8ad6-e78604301d3a" containerID="0902f30878c51e4f3236bd867ace5550ff04114b09d7f0926101a8f73cf8cc0d" exitCode=0 Mar 16 00:32:18 crc kubenswrapper[4816]: I0316 00:32:18.790935 4816 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-2-build" event={"ID":"8ee83ac3-283d-44bb-8ad6-e78604301d3a","Type":"ContainerDied","Data":"0902f30878c51e4f3236bd867ace5550ff04114b09d7f0926101a8f73cf8cc0d"} Mar 16 00:32:20 crc kubenswrapper[4816]: I0316 00:32:20.085603 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 16 00:32:20 crc kubenswrapper[4816]: I0316 00:32:20.190067 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jvw6d\" (UniqueName: \"kubernetes.io/projected/8ee83ac3-283d-44bb-8ad6-e78604301d3a-kube-api-access-jvw6d\") pod \"8ee83ac3-283d-44bb-8ad6-e78604301d3a\" (UID: \"8ee83ac3-283d-44bb-8ad6-e78604301d3a\") " Mar 16 00:32:20 crc kubenswrapper[4816]: I0316 00:32:20.190105 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/8ee83ac3-283d-44bb-8ad6-e78604301d3a-buildcachedir\") pod \"8ee83ac3-283d-44bb-8ad6-e78604301d3a\" (UID: \"8ee83ac3-283d-44bb-8ad6-e78604301d3a\") " Mar 16 00:32:20 crc kubenswrapper[4816]: I0316 00:32:20.190150 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-fs5z5-push\" (UniqueName: \"kubernetes.io/secret/8ee83ac3-283d-44bb-8ad6-e78604301d3a-builder-dockercfg-fs5z5-push\") pod \"8ee83ac3-283d-44bb-8ad6-e78604301d3a\" (UID: \"8ee83ac3-283d-44bb-8ad6-e78604301d3a\") " Mar 16 00:32:20 crc kubenswrapper[4816]: I0316 00:32:20.190192 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/8ee83ac3-283d-44bb-8ad6-e78604301d3a-build-system-configs\") pod \"8ee83ac3-283d-44bb-8ad6-e78604301d3a\" (UID: \"8ee83ac3-283d-44bb-8ad6-e78604301d3a\") " Mar 16 00:32:20 crc kubenswrapper[4816]: I0316 00:32:20.190223 4816 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/8ee83ac3-283d-44bb-8ad6-e78604301d3a-build-blob-cache\") pod \"8ee83ac3-283d-44bb-8ad6-e78604301d3a\" (UID: \"8ee83ac3-283d-44bb-8ad6-e78604301d3a\") " Mar 16 00:32:20 crc kubenswrapper[4816]: I0316 00:32:20.190226 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8ee83ac3-283d-44bb-8ad6-e78604301d3a-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "8ee83ac3-283d-44bb-8ad6-e78604301d3a" (UID: "8ee83ac3-283d-44bb-8ad6-e78604301d3a"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:32:20 crc kubenswrapper[4816]: I0316 00:32:20.190260 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8ee83ac3-283d-44bb-8ad6-e78604301d3a-build-proxy-ca-bundles\") pod \"8ee83ac3-283d-44bb-8ad6-e78604301d3a\" (UID: \"8ee83ac3-283d-44bb-8ad6-e78604301d3a\") " Mar 16 00:32:20 crc kubenswrapper[4816]: I0316 00:32:20.190285 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8ee83ac3-283d-44bb-8ad6-e78604301d3a-build-ca-bundles\") pod \"8ee83ac3-283d-44bb-8ad6-e78604301d3a\" (UID: \"8ee83ac3-283d-44bb-8ad6-e78604301d3a\") " Mar 16 00:32:20 crc kubenswrapper[4816]: I0316 00:32:20.190310 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-fs5z5-pull\" (UniqueName: \"kubernetes.io/secret/8ee83ac3-283d-44bb-8ad6-e78604301d3a-builder-dockercfg-fs5z5-pull\") pod \"8ee83ac3-283d-44bb-8ad6-e78604301d3a\" (UID: \"8ee83ac3-283d-44bb-8ad6-e78604301d3a\") " Mar 16 00:32:20 crc kubenswrapper[4816]: I0316 00:32:20.190336 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/8ee83ac3-283d-44bb-8ad6-e78604301d3a-node-pullsecrets\") pod \"8ee83ac3-283d-44bb-8ad6-e78604301d3a\" (UID: \"8ee83ac3-283d-44bb-8ad6-e78604301d3a\") " Mar 16 00:32:20 crc kubenswrapper[4816]: I0316 00:32:20.190369 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/8ee83ac3-283d-44bb-8ad6-e78604301d3a-container-storage-root\") pod \"8ee83ac3-283d-44bb-8ad6-e78604301d3a\" (UID: \"8ee83ac3-283d-44bb-8ad6-e78604301d3a\") " Mar 16 00:32:20 crc kubenswrapper[4816]: I0316 00:32:20.190396 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8ee83ac3-283d-44bb-8ad6-e78604301d3a-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "8ee83ac3-283d-44bb-8ad6-e78604301d3a" (UID: "8ee83ac3-283d-44bb-8ad6-e78604301d3a"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:32:20 crc kubenswrapper[4816]: I0316 00:32:20.190419 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/8ee83ac3-283d-44bb-8ad6-e78604301d3a-container-storage-run\") pod \"8ee83ac3-283d-44bb-8ad6-e78604301d3a\" (UID: \"8ee83ac3-283d-44bb-8ad6-e78604301d3a\") " Mar 16 00:32:20 crc kubenswrapper[4816]: I0316 00:32:20.190460 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/8ee83ac3-283d-44bb-8ad6-e78604301d3a-buildworkdir\") pod \"8ee83ac3-283d-44bb-8ad6-e78604301d3a\" (UID: \"8ee83ac3-283d-44bb-8ad6-e78604301d3a\") " Mar 16 00:32:20 crc kubenswrapper[4816]: I0316 00:32:20.190729 4816 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/8ee83ac3-283d-44bb-8ad6-e78604301d3a-node-pullsecrets\") on node 
\"crc\" DevicePath \"\"" Mar 16 00:32:20 crc kubenswrapper[4816]: I0316 00:32:20.190746 4816 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/8ee83ac3-283d-44bb-8ad6-e78604301d3a-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 16 00:32:20 crc kubenswrapper[4816]: I0316 00:32:20.190987 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8ee83ac3-283d-44bb-8ad6-e78604301d3a-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "8ee83ac3-283d-44bb-8ad6-e78604301d3a" (UID: "8ee83ac3-283d-44bb-8ad6-e78604301d3a"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:32:20 crc kubenswrapper[4816]: I0316 00:32:20.191355 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ee83ac3-283d-44bb-8ad6-e78604301d3a-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "8ee83ac3-283d-44bb-8ad6-e78604301d3a" (UID: "8ee83ac3-283d-44bb-8ad6-e78604301d3a"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:32:20 crc kubenswrapper[4816]: I0316 00:32:20.192434 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8ee83ac3-283d-44bb-8ad6-e78604301d3a-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "8ee83ac3-283d-44bb-8ad6-e78604301d3a" (UID: "8ee83ac3-283d-44bb-8ad6-e78604301d3a"). InnerVolumeSpecName "build-proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:32:20 crc kubenswrapper[4816]: I0316 00:32:20.192532 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8ee83ac3-283d-44bb-8ad6-e78604301d3a-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "8ee83ac3-283d-44bb-8ad6-e78604301d3a" (UID: "8ee83ac3-283d-44bb-8ad6-e78604301d3a"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:32:20 crc kubenswrapper[4816]: I0316 00:32:20.195610 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ee83ac3-283d-44bb-8ad6-e78604301d3a-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "8ee83ac3-283d-44bb-8ad6-e78604301d3a" (UID: "8ee83ac3-283d-44bb-8ad6-e78604301d3a"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:32:20 crc kubenswrapper[4816]: I0316 00:32:20.195891 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ee83ac3-283d-44bb-8ad6-e78604301d3a-builder-dockercfg-fs5z5-pull" (OuterVolumeSpecName: "builder-dockercfg-fs5z5-pull") pod "8ee83ac3-283d-44bb-8ad6-e78604301d3a" (UID: "8ee83ac3-283d-44bb-8ad6-e78604301d3a"). InnerVolumeSpecName "builder-dockercfg-fs5z5-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:32:20 crc kubenswrapper[4816]: I0316 00:32:20.196508 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ee83ac3-283d-44bb-8ad6-e78604301d3a-kube-api-access-jvw6d" (OuterVolumeSpecName: "kube-api-access-jvw6d") pod "8ee83ac3-283d-44bb-8ad6-e78604301d3a" (UID: "8ee83ac3-283d-44bb-8ad6-e78604301d3a"). InnerVolumeSpecName "kube-api-access-jvw6d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:32:20 crc kubenswrapper[4816]: I0316 00:32:20.202714 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ee83ac3-283d-44bb-8ad6-e78604301d3a-builder-dockercfg-fs5z5-push" (OuterVolumeSpecName: "builder-dockercfg-fs5z5-push") pod "8ee83ac3-283d-44bb-8ad6-e78604301d3a" (UID: "8ee83ac3-283d-44bb-8ad6-e78604301d3a"). InnerVolumeSpecName "builder-dockercfg-fs5z5-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:32:20 crc kubenswrapper[4816]: I0316 00:32:20.283328 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ee83ac3-283d-44bb-8ad6-e78604301d3a-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "8ee83ac3-283d-44bb-8ad6-e78604301d3a" (UID: "8ee83ac3-283d-44bb-8ad6-e78604301d3a"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:32:20 crc kubenswrapper[4816]: I0316 00:32:20.292133 4816 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8ee83ac3-283d-44bb-8ad6-e78604301d3a-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 16 00:32:20 crc kubenswrapper[4816]: I0316 00:32:20.292170 4816 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8ee83ac3-283d-44bb-8ad6-e78604301d3a-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 16 00:32:20 crc kubenswrapper[4816]: I0316 00:32:20.292180 4816 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-fs5z5-pull\" (UniqueName: \"kubernetes.io/secret/8ee83ac3-283d-44bb-8ad6-e78604301d3a-builder-dockercfg-fs5z5-pull\") on node \"crc\" DevicePath \"\"" Mar 16 00:32:20 crc kubenswrapper[4816]: I0316 00:32:20.292190 4816 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: 
\"kubernetes.io/empty-dir/8ee83ac3-283d-44bb-8ad6-e78604301d3a-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 16 00:32:20 crc kubenswrapper[4816]: I0316 00:32:20.292200 4816 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/8ee83ac3-283d-44bb-8ad6-e78604301d3a-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 16 00:32:20 crc kubenswrapper[4816]: I0316 00:32:20.292209 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jvw6d\" (UniqueName: \"kubernetes.io/projected/8ee83ac3-283d-44bb-8ad6-e78604301d3a-kube-api-access-jvw6d\") on node \"crc\" DevicePath \"\"" Mar 16 00:32:20 crc kubenswrapper[4816]: I0316 00:32:20.292217 4816 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-fs5z5-push\" (UniqueName: \"kubernetes.io/secret/8ee83ac3-283d-44bb-8ad6-e78604301d3a-builder-dockercfg-fs5z5-push\") on node \"crc\" DevicePath \"\"" Mar 16 00:32:20 crc kubenswrapper[4816]: I0316 00:32:20.292226 4816 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/8ee83ac3-283d-44bb-8ad6-e78604301d3a-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 16 00:32:20 crc kubenswrapper[4816]: I0316 00:32:20.292235 4816 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/8ee83ac3-283d-44bb-8ad6-e78604301d3a-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 16 00:32:20 crc kubenswrapper[4816]: I0316 00:32:20.807838 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-2-build" event={"ID":"8ee83ac3-283d-44bb-8ad6-e78604301d3a","Type":"ContainerDied","Data":"0113d4b80ed0b8fe7937cde689e4ea8e705fc8e36ad867715f3142b9de604104"} Mar 16 00:32:20 crc kubenswrapper[4816]: I0316 00:32:20.808071 4816 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="0113d4b80ed0b8fe7937cde689e4ea8e705fc8e36ad867715f3142b9de604104" Mar 16 00:32:20 crc kubenswrapper[4816]: I0316 00:32:20.807910 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 16 00:32:21 crc kubenswrapper[4816]: I0316 00:32:21.013337 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-tmfj2"] Mar 16 00:32:21 crc kubenswrapper[4816]: E0316 00:32:21.013862 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cee5a2cc-4256-43fb-9517-83533a5acf29" containerName="oc" Mar 16 00:32:21 crc kubenswrapper[4816]: I0316 00:32:21.013902 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="cee5a2cc-4256-43fb-9517-83533a5acf29" containerName="oc" Mar 16 00:32:21 crc kubenswrapper[4816]: E0316 00:32:21.013929 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ee83ac3-283d-44bb-8ad6-e78604301d3a" containerName="manage-dockerfile" Mar 16 00:32:21 crc kubenswrapper[4816]: I0316 00:32:21.013944 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ee83ac3-283d-44bb-8ad6-e78604301d3a" containerName="manage-dockerfile" Mar 16 00:32:21 crc kubenswrapper[4816]: E0316 00:32:21.013971 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ee83ac3-283d-44bb-8ad6-e78604301d3a" containerName="docker-build" Mar 16 00:32:21 crc kubenswrapper[4816]: I0316 00:32:21.013985 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ee83ac3-283d-44bb-8ad6-e78604301d3a" containerName="docker-build" Mar 16 00:32:21 crc kubenswrapper[4816]: E0316 00:32:21.014007 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ee83ac3-283d-44bb-8ad6-e78604301d3a" containerName="git-clone" Mar 16 00:32:21 crc kubenswrapper[4816]: I0316 00:32:21.014020 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ee83ac3-283d-44bb-8ad6-e78604301d3a" containerName="git-clone" Mar 
16 00:32:21 crc kubenswrapper[4816]: I0316 00:32:21.014246 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="cee5a2cc-4256-43fb-9517-83533a5acf29" containerName="oc" Mar 16 00:32:21 crc kubenswrapper[4816]: I0316 00:32:21.014282 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ee83ac3-283d-44bb-8ad6-e78604301d3a" containerName="docker-build" Mar 16 00:32:21 crc kubenswrapper[4816]: I0316 00:32:21.015901 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tmfj2" Mar 16 00:32:21 crc kubenswrapper[4816]: I0316 00:32:21.025201 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tmfj2"] Mar 16 00:32:21 crc kubenswrapper[4816]: I0316 00:32:21.102909 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/05447e0c-cfec-4548-a367-b4058cd9ee40-utilities\") pod \"redhat-operators-tmfj2\" (UID: \"05447e0c-cfec-4548-a367-b4058cd9ee40\") " pod="openshift-marketplace/redhat-operators-tmfj2" Mar 16 00:32:21 crc kubenswrapper[4816]: I0316 00:32:21.102980 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/05447e0c-cfec-4548-a367-b4058cd9ee40-catalog-content\") pod \"redhat-operators-tmfj2\" (UID: \"05447e0c-cfec-4548-a367-b4058cd9ee40\") " pod="openshift-marketplace/redhat-operators-tmfj2" Mar 16 00:32:21 crc kubenswrapper[4816]: I0316 00:32:21.103012 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9bmlt\" (UniqueName: \"kubernetes.io/projected/05447e0c-cfec-4548-a367-b4058cd9ee40-kube-api-access-9bmlt\") pod \"redhat-operators-tmfj2\" (UID: \"05447e0c-cfec-4548-a367-b4058cd9ee40\") " pod="openshift-marketplace/redhat-operators-tmfj2" Mar 16 00:32:21 crc 
kubenswrapper[4816]: I0316 00:32:21.199995 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ee83ac3-283d-44bb-8ad6-e78604301d3a-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "8ee83ac3-283d-44bb-8ad6-e78604301d3a" (UID: "8ee83ac3-283d-44bb-8ad6-e78604301d3a"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:32:21 crc kubenswrapper[4816]: I0316 00:32:21.204771 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/05447e0c-cfec-4548-a367-b4058cd9ee40-catalog-content\") pod \"redhat-operators-tmfj2\" (UID: \"05447e0c-cfec-4548-a367-b4058cd9ee40\") " pod="openshift-marketplace/redhat-operators-tmfj2" Mar 16 00:32:21 crc kubenswrapper[4816]: I0316 00:32:21.204817 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9bmlt\" (UniqueName: \"kubernetes.io/projected/05447e0c-cfec-4548-a367-b4058cd9ee40-kube-api-access-9bmlt\") pod \"redhat-operators-tmfj2\" (UID: \"05447e0c-cfec-4548-a367-b4058cd9ee40\") " pod="openshift-marketplace/redhat-operators-tmfj2" Mar 16 00:32:21 crc kubenswrapper[4816]: I0316 00:32:21.204874 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/05447e0c-cfec-4548-a367-b4058cd9ee40-utilities\") pod \"redhat-operators-tmfj2\" (UID: \"05447e0c-cfec-4548-a367-b4058cd9ee40\") " pod="openshift-marketplace/redhat-operators-tmfj2" Mar 16 00:32:21 crc kubenswrapper[4816]: I0316 00:32:21.204919 4816 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/8ee83ac3-283d-44bb-8ad6-e78604301d3a-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 16 00:32:21 crc kubenswrapper[4816]: I0316 00:32:21.205307 4816 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/05447e0c-cfec-4548-a367-b4058cd9ee40-utilities\") pod \"redhat-operators-tmfj2\" (UID: \"05447e0c-cfec-4548-a367-b4058cd9ee40\") " pod="openshift-marketplace/redhat-operators-tmfj2" Mar 16 00:32:21 crc kubenswrapper[4816]: I0316 00:32:21.205371 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/05447e0c-cfec-4548-a367-b4058cd9ee40-catalog-content\") pod \"redhat-operators-tmfj2\" (UID: \"05447e0c-cfec-4548-a367-b4058cd9ee40\") " pod="openshift-marketplace/redhat-operators-tmfj2" Mar 16 00:32:21 crc kubenswrapper[4816]: I0316 00:32:21.226319 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9bmlt\" (UniqueName: \"kubernetes.io/projected/05447e0c-cfec-4548-a367-b4058cd9ee40-kube-api-access-9bmlt\") pod \"redhat-operators-tmfj2\" (UID: \"05447e0c-cfec-4548-a367-b4058cd9ee40\") " pod="openshift-marketplace/redhat-operators-tmfj2" Mar 16 00:32:21 crc kubenswrapper[4816]: I0316 00:32:21.383387 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-tmfj2" Mar 16 00:32:21 crc kubenswrapper[4816]: I0316 00:32:21.834278 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tmfj2"] Mar 16 00:32:22 crc kubenswrapper[4816]: I0316 00:32:22.830319 4816 generic.go:334] "Generic (PLEG): container finished" podID="05447e0c-cfec-4548-a367-b4058cd9ee40" containerID="25145667416f108acf66af8cb8050c2e3a99db03759d435f751fb8c2664712c6" exitCode=0 Mar 16 00:32:22 crc kubenswrapper[4816]: I0316 00:32:22.830405 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tmfj2" event={"ID":"05447e0c-cfec-4548-a367-b4058cd9ee40","Type":"ContainerDied","Data":"25145667416f108acf66af8cb8050c2e3a99db03759d435f751fb8c2664712c6"} Mar 16 00:32:22 crc kubenswrapper[4816]: I0316 00:32:22.830659 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tmfj2" event={"ID":"05447e0c-cfec-4548-a367-b4058cd9ee40","Type":"ContainerStarted","Data":"788e7b9c485e2da4d4c729d6018e362f67e9125881eaf7b3347cf2e96230957c"} Mar 16 00:32:23 crc kubenswrapper[4816]: I0316 00:32:23.839402 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tmfj2" event={"ID":"05447e0c-cfec-4548-a367-b4058cd9ee40","Type":"ContainerStarted","Data":"b6e0ddce6965ab602acc7ac1b54b27f26e5eabfe05171bdd286f5ad548b30ea8"} Mar 16 00:32:24 crc kubenswrapper[4816]: I0316 00:32:24.848138 4816 generic.go:334] "Generic (PLEG): container finished" podID="05447e0c-cfec-4548-a367-b4058cd9ee40" containerID="b6e0ddce6965ab602acc7ac1b54b27f26e5eabfe05171bdd286f5ad548b30ea8" exitCode=0 Mar 16 00:32:24 crc kubenswrapper[4816]: I0316 00:32:24.848196 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tmfj2" 
event={"ID":"05447e0c-cfec-4548-a367-b4058cd9ee40","Type":"ContainerDied","Data":"b6e0ddce6965ab602acc7ac1b54b27f26e5eabfe05171bdd286f5ad548b30ea8"} Mar 16 00:32:25 crc kubenswrapper[4816]: I0316 00:32:25.856671 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tmfj2" event={"ID":"05447e0c-cfec-4548-a367-b4058cd9ee40","Type":"ContainerStarted","Data":"ad40cf381627305d1b58213b1985a792b55e9459828e9e50b09a60ed1eb6cd2a"} Mar 16 00:32:25 crc kubenswrapper[4816]: I0316 00:32:25.881774 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-tmfj2" podStartSLOduration=3.461727266 podStartE2EDuration="5.881749662s" podCreationTimestamp="2026-03-16 00:32:20 +0000 UTC" firstStartedPulling="2026-03-16 00:32:22.834850755 +0000 UTC m=+1535.931150728" lastFinishedPulling="2026-03-16 00:32:25.254873171 +0000 UTC m=+1538.351173124" observedRunningTime="2026-03-16 00:32:25.880195358 +0000 UTC m=+1538.976495351" watchObservedRunningTime="2026-03-16 00:32:25.881749662 +0000 UTC m=+1538.978049635" Mar 16 00:32:29 crc kubenswrapper[4816]: I0316 00:32:29.434454 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/service-telemetry-operator-bundle-1-build"] Mar 16 00:32:29 crc kubenswrapper[4816]: I0316 00:32:29.435968 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 16 00:32:29 crc kubenswrapper[4816]: I0316 00:32:29.437476 4816 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-fs5z5" Mar 16 00:32:29 crc kubenswrapper[4816]: I0316 00:32:29.437828 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-bundle-1-ca" Mar 16 00:32:29 crc kubenswrapper[4816]: I0316 00:32:29.439441 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-bundle-1-global-ca" Mar 16 00:32:29 crc kubenswrapper[4816]: I0316 00:32:29.439442 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-bundle-1-sys-config" Mar 16 00:32:29 crc kubenswrapper[4816]: I0316 00:32:29.452733 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-bundle-1-build"] Mar 16 00:32:29 crc kubenswrapper[4816]: I0316 00:32:29.519722 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/38d77c46-58bc-4dd3-a874-85e5b14c1585-container-storage-root\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"38d77c46-58bc-4dd3-a874-85e5b14c1585\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 16 00:32:29 crc kubenswrapper[4816]: I0316 00:32:29.519771 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/38d77c46-58bc-4dd3-a874-85e5b14c1585-build-system-configs\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"38d77c46-58bc-4dd3-a874-85e5b14c1585\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 16 00:32:29 crc 
kubenswrapper[4816]: I0316 00:32:29.519803 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/38d77c46-58bc-4dd3-a874-85e5b14c1585-buildcachedir\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"38d77c46-58bc-4dd3-a874-85e5b14c1585\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 16 00:32:29 crc kubenswrapper[4816]: I0316 00:32:29.519855 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-fs5z5-push\" (UniqueName: \"kubernetes.io/secret/38d77c46-58bc-4dd3-a874-85e5b14c1585-builder-dockercfg-fs5z5-push\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"38d77c46-58bc-4dd3-a874-85e5b14c1585\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 16 00:32:29 crc kubenswrapper[4816]: I0316 00:32:29.519917 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/38d77c46-58bc-4dd3-a874-85e5b14c1585-build-proxy-ca-bundles\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"38d77c46-58bc-4dd3-a874-85e5b14c1585\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 16 00:32:29 crc kubenswrapper[4816]: I0316 00:32:29.519941 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/38d77c46-58bc-4dd3-a874-85e5b14c1585-build-blob-cache\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"38d77c46-58bc-4dd3-a874-85e5b14c1585\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 16 00:32:29 crc kubenswrapper[4816]: I0316 00:32:29.519956 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vftwd\" (UniqueName: 
\"kubernetes.io/projected/38d77c46-58bc-4dd3-a874-85e5b14c1585-kube-api-access-vftwd\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"38d77c46-58bc-4dd3-a874-85e5b14c1585\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 16 00:32:29 crc kubenswrapper[4816]: I0316 00:32:29.520014 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-fs5z5-pull\" (UniqueName: \"kubernetes.io/secret/38d77c46-58bc-4dd3-a874-85e5b14c1585-builder-dockercfg-fs5z5-pull\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"38d77c46-58bc-4dd3-a874-85e5b14c1585\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 16 00:32:29 crc kubenswrapper[4816]: I0316 00:32:29.520061 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/38d77c46-58bc-4dd3-a874-85e5b14c1585-node-pullsecrets\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"38d77c46-58bc-4dd3-a874-85e5b14c1585\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 16 00:32:29 crc kubenswrapper[4816]: I0316 00:32:29.520078 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/38d77c46-58bc-4dd3-a874-85e5b14c1585-container-storage-run\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"38d77c46-58bc-4dd3-a874-85e5b14c1585\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 16 00:32:29 crc kubenswrapper[4816]: I0316 00:32:29.520110 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/38d77c46-58bc-4dd3-a874-85e5b14c1585-buildworkdir\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"38d77c46-58bc-4dd3-a874-85e5b14c1585\") " 
pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 16 00:32:29 crc kubenswrapper[4816]: I0316 00:32:29.520127 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/38d77c46-58bc-4dd3-a874-85e5b14c1585-build-ca-bundles\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"38d77c46-58bc-4dd3-a874-85e5b14c1585\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 16 00:32:29 crc kubenswrapper[4816]: I0316 00:32:29.621960 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-fs5z5-push\" (UniqueName: \"kubernetes.io/secret/38d77c46-58bc-4dd3-a874-85e5b14c1585-builder-dockercfg-fs5z5-push\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"38d77c46-58bc-4dd3-a874-85e5b14c1585\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 16 00:32:29 crc kubenswrapper[4816]: I0316 00:32:29.622385 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/38d77c46-58bc-4dd3-a874-85e5b14c1585-build-proxy-ca-bundles\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"38d77c46-58bc-4dd3-a874-85e5b14c1585\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 16 00:32:29 crc kubenswrapper[4816]: I0316 00:32:29.623055 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/38d77c46-58bc-4dd3-a874-85e5b14c1585-build-blob-cache\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"38d77c46-58bc-4dd3-a874-85e5b14c1585\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 16 00:32:29 crc kubenswrapper[4816]: I0316 00:32:29.623399 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/38d77c46-58bc-4dd3-a874-85e5b14c1585-build-proxy-ca-bundles\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"38d77c46-58bc-4dd3-a874-85e5b14c1585\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 16 00:32:29 crc kubenswrapper[4816]: I0316 00:32:29.623859 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/38d77c46-58bc-4dd3-a874-85e5b14c1585-build-blob-cache\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"38d77c46-58bc-4dd3-a874-85e5b14c1585\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 16 00:32:29 crc kubenswrapper[4816]: I0316 00:32:29.624051 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vftwd\" (UniqueName: \"kubernetes.io/projected/38d77c46-58bc-4dd3-a874-85e5b14c1585-kube-api-access-vftwd\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"38d77c46-58bc-4dd3-a874-85e5b14c1585\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 16 00:32:29 crc kubenswrapper[4816]: I0316 00:32:29.624202 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-fs5z5-pull\" (UniqueName: \"kubernetes.io/secret/38d77c46-58bc-4dd3-a874-85e5b14c1585-builder-dockercfg-fs5z5-pull\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"38d77c46-58bc-4dd3-a874-85e5b14c1585\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 16 00:32:29 crc kubenswrapper[4816]: I0316 00:32:29.624310 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/38d77c46-58bc-4dd3-a874-85e5b14c1585-node-pullsecrets\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"38d77c46-58bc-4dd3-a874-85e5b14c1585\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 
16 00:32:29 crc kubenswrapper[4816]: I0316 00:32:29.624393 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/38d77c46-58bc-4dd3-a874-85e5b14c1585-container-storage-run\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"38d77c46-58bc-4dd3-a874-85e5b14c1585\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 16 00:32:29 crc kubenswrapper[4816]: I0316 00:32:29.624529 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/38d77c46-58bc-4dd3-a874-85e5b14c1585-buildworkdir\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"38d77c46-58bc-4dd3-a874-85e5b14c1585\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 16 00:32:29 crc kubenswrapper[4816]: I0316 00:32:29.624691 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/38d77c46-58bc-4dd3-a874-85e5b14c1585-build-ca-bundles\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"38d77c46-58bc-4dd3-a874-85e5b14c1585\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 16 00:32:29 crc kubenswrapper[4816]: I0316 00:32:29.624836 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/38d77c46-58bc-4dd3-a874-85e5b14c1585-buildworkdir\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"38d77c46-58bc-4dd3-a874-85e5b14c1585\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 16 00:32:29 crc kubenswrapper[4816]: I0316 00:32:29.624874 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/38d77c46-58bc-4dd3-a874-85e5b14c1585-node-pullsecrets\") pod \"service-telemetry-operator-bundle-1-build\" (UID: 
\"38d77c46-58bc-4dd3-a874-85e5b14c1585\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 16 00:32:29 crc kubenswrapper[4816]: I0316 00:32:29.624737 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/38d77c46-58bc-4dd3-a874-85e5b14c1585-container-storage-run\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"38d77c46-58bc-4dd3-a874-85e5b14c1585\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 16 00:32:29 crc kubenswrapper[4816]: I0316 00:32:29.625135 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/38d77c46-58bc-4dd3-a874-85e5b14c1585-container-storage-root\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"38d77c46-58bc-4dd3-a874-85e5b14c1585\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 16 00:32:29 crc kubenswrapper[4816]: I0316 00:32:29.625223 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/38d77c46-58bc-4dd3-a874-85e5b14c1585-build-system-configs\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"38d77c46-58bc-4dd3-a874-85e5b14c1585\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 16 00:32:29 crc kubenswrapper[4816]: I0316 00:32:29.625328 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/38d77c46-58bc-4dd3-a874-85e5b14c1585-buildcachedir\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"38d77c46-58bc-4dd3-a874-85e5b14c1585\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 16 00:32:29 crc kubenswrapper[4816]: I0316 00:32:29.625540 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" 
(UniqueName: \"kubernetes.io/configmap/38d77c46-58bc-4dd3-a874-85e5b14c1585-build-ca-bundles\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"38d77c46-58bc-4dd3-a874-85e5b14c1585\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 16 00:32:29 crc kubenswrapper[4816]: I0316 00:32:29.625705 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/38d77c46-58bc-4dd3-a874-85e5b14c1585-buildcachedir\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"38d77c46-58bc-4dd3-a874-85e5b14c1585\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 16 00:32:29 crc kubenswrapper[4816]: I0316 00:32:29.625814 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/38d77c46-58bc-4dd3-a874-85e5b14c1585-container-storage-root\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"38d77c46-58bc-4dd3-a874-85e5b14c1585\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 16 00:32:29 crc kubenswrapper[4816]: I0316 00:32:29.626478 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/38d77c46-58bc-4dd3-a874-85e5b14c1585-build-system-configs\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"38d77c46-58bc-4dd3-a874-85e5b14c1585\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 16 00:32:29 crc kubenswrapper[4816]: I0316 00:32:29.629658 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-fs5z5-pull\" (UniqueName: \"kubernetes.io/secret/38d77c46-58bc-4dd3-a874-85e5b14c1585-builder-dockercfg-fs5z5-pull\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"38d77c46-58bc-4dd3-a874-85e5b14c1585\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 16 00:32:29 crc 
kubenswrapper[4816]: I0316 00:32:29.638079 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-fs5z5-push\" (UniqueName: \"kubernetes.io/secret/38d77c46-58bc-4dd3-a874-85e5b14c1585-builder-dockercfg-fs5z5-push\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"38d77c46-58bc-4dd3-a874-85e5b14c1585\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 16 00:32:29 crc kubenswrapper[4816]: I0316 00:32:29.646163 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vftwd\" (UniqueName: \"kubernetes.io/projected/38d77c46-58bc-4dd3-a874-85e5b14c1585-kube-api-access-vftwd\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"38d77c46-58bc-4dd3-a874-85e5b14c1585\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 16 00:32:29 crc kubenswrapper[4816]: I0316 00:32:29.756301 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 16 00:32:30 crc kubenswrapper[4816]: W0316 00:32:30.021235 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod38d77c46_58bc_4dd3_a874_85e5b14c1585.slice/crio-eca8ebc315293f56134648b608d393b026abc700dcac227487eb6afcf4852dbb WatchSource:0}: Error finding container eca8ebc315293f56134648b608d393b026abc700dcac227487eb6afcf4852dbb: Status 404 returned error can't find the container with id eca8ebc315293f56134648b608d393b026abc700dcac227487eb6afcf4852dbb Mar 16 00:32:30 crc kubenswrapper[4816]: I0316 00:32:30.027166 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-bundle-1-build"] Mar 16 00:32:30 crc kubenswrapper[4816]: I0316 00:32:30.903455 4816 generic.go:334] "Generic (PLEG): container finished" podID="38d77c46-58bc-4dd3-a874-85e5b14c1585" 
containerID="520f1908678615d0e5b73ebdbbe6a48ebe9c84b2afda6fc6f03c6d31b9a2fb39" exitCode=0 Mar 16 00:32:30 crc kubenswrapper[4816]: I0316 00:32:30.903580 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-1-build" event={"ID":"38d77c46-58bc-4dd3-a874-85e5b14c1585","Type":"ContainerDied","Data":"520f1908678615d0e5b73ebdbbe6a48ebe9c84b2afda6fc6f03c6d31b9a2fb39"} Mar 16 00:32:30 crc kubenswrapper[4816]: I0316 00:32:30.903848 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-1-build" event={"ID":"38d77c46-58bc-4dd3-a874-85e5b14c1585","Type":"ContainerStarted","Data":"eca8ebc315293f56134648b608d393b026abc700dcac227487eb6afcf4852dbb"} Mar 16 00:32:31 crc kubenswrapper[4816]: I0316 00:32:31.384309 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-tmfj2" Mar 16 00:32:31 crc kubenswrapper[4816]: I0316 00:32:31.384381 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-tmfj2" Mar 16 00:32:31 crc kubenswrapper[4816]: I0316 00:32:31.863674 4816 patch_prober.go:28] interesting pod/machine-config-daemon-jrdcz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 16 00:32:31 crc kubenswrapper[4816]: I0316 00:32:31.863768 4816 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" podUID="dd08ece2-7636-4966-973a-e96a34b70b53" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 16 00:32:31 crc kubenswrapper[4816]: I0316 00:32:31.918643 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="service-telemetry/service-telemetry-operator-bundle-1-build" event={"ID":"38d77c46-58bc-4dd3-a874-85e5b14c1585","Type":"ContainerStarted","Data":"d67550a0f7d36330df42139700602ca21750bc1c1f58bc1cc3a210d0f86409d3"} Mar 16 00:32:31 crc kubenswrapper[4816]: I0316 00:32:31.945716 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/service-telemetry-operator-bundle-1-build" podStartSLOduration=2.945694067 podStartE2EDuration="2.945694067s" podCreationTimestamp="2026-03-16 00:32:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:32:31.943846666 +0000 UTC m=+1545.040146629" watchObservedRunningTime="2026-03-16 00:32:31.945694067 +0000 UTC m=+1545.041994020" Mar 16 00:32:32 crc kubenswrapper[4816]: I0316 00:32:32.431765 4816 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-tmfj2" podUID="05447e0c-cfec-4548-a367-b4058cd9ee40" containerName="registry-server" probeResult="failure" output=< Mar 16 00:32:32 crc kubenswrapper[4816]: timeout: failed to connect service ":50051" within 1s Mar 16 00:32:32 crc kubenswrapper[4816]: > Mar 16 00:32:33 crc kubenswrapper[4816]: I0316 00:32:33.935365 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-bundle-1-build_38d77c46-58bc-4dd3-a874-85e5b14c1585/docker-build/0.log" Mar 16 00:32:33 crc kubenswrapper[4816]: I0316 00:32:33.937216 4816 generic.go:334] "Generic (PLEG): container finished" podID="38d77c46-58bc-4dd3-a874-85e5b14c1585" containerID="d67550a0f7d36330df42139700602ca21750bc1c1f58bc1cc3a210d0f86409d3" exitCode=1 Mar 16 00:32:33 crc kubenswrapper[4816]: I0316 00:32:33.937288 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-1-build" 
event={"ID":"38d77c46-58bc-4dd3-a874-85e5b14c1585","Type":"ContainerDied","Data":"d67550a0f7d36330df42139700602ca21750bc1c1f58bc1cc3a210d0f86409d3"} Mar 16 00:32:35 crc kubenswrapper[4816]: I0316 00:32:35.158910 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-bundle-1-build_38d77c46-58bc-4dd3-a874-85e5b14c1585/docker-build/0.log" Mar 16 00:32:35 crc kubenswrapper[4816]: I0316 00:32:35.159804 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 16 00:32:35 crc kubenswrapper[4816]: I0316 00:32:35.301317 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/38d77c46-58bc-4dd3-a874-85e5b14c1585-buildcachedir\") pod \"38d77c46-58bc-4dd3-a874-85e5b14c1585\" (UID: \"38d77c46-58bc-4dd3-a874-85e5b14c1585\") " Mar 16 00:32:35 crc kubenswrapper[4816]: I0316 00:32:35.301377 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/38d77c46-58bc-4dd3-a874-85e5b14c1585-container-storage-root\") pod \"38d77c46-58bc-4dd3-a874-85e5b14c1585\" (UID: \"38d77c46-58bc-4dd3-a874-85e5b14c1585\") " Mar 16 00:32:35 crc kubenswrapper[4816]: I0316 00:32:35.301404 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/38d77c46-58bc-4dd3-a874-85e5b14c1585-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "38d77c46-58bc-4dd3-a874-85e5b14c1585" (UID: "38d77c46-58bc-4dd3-a874-85e5b14c1585"). InnerVolumeSpecName "buildcachedir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:32:35 crc kubenswrapper[4816]: I0316 00:32:35.301424 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/38d77c46-58bc-4dd3-a874-85e5b14c1585-buildworkdir\") pod \"38d77c46-58bc-4dd3-a874-85e5b14c1585\" (UID: \"38d77c46-58bc-4dd3-a874-85e5b14c1585\") " Mar 16 00:32:35 crc kubenswrapper[4816]: I0316 00:32:35.301476 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vftwd\" (UniqueName: \"kubernetes.io/projected/38d77c46-58bc-4dd3-a874-85e5b14c1585-kube-api-access-vftwd\") pod \"38d77c46-58bc-4dd3-a874-85e5b14c1585\" (UID: \"38d77c46-58bc-4dd3-a874-85e5b14c1585\") " Mar 16 00:32:35 crc kubenswrapper[4816]: I0316 00:32:35.301510 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/38d77c46-58bc-4dd3-a874-85e5b14c1585-build-system-configs\") pod \"38d77c46-58bc-4dd3-a874-85e5b14c1585\" (UID: \"38d77c46-58bc-4dd3-a874-85e5b14c1585\") " Mar 16 00:32:35 crc kubenswrapper[4816]: I0316 00:32:35.301568 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/38d77c46-58bc-4dd3-a874-85e5b14c1585-build-blob-cache\") pod \"38d77c46-58bc-4dd3-a874-85e5b14c1585\" (UID: \"38d77c46-58bc-4dd3-a874-85e5b14c1585\") " Mar 16 00:32:35 crc kubenswrapper[4816]: I0316 00:32:35.302050 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/38d77c46-58bc-4dd3-a874-85e5b14c1585-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "38d77c46-58bc-4dd3-a874-85e5b14c1585" (UID: "38d77c46-58bc-4dd3-a874-85e5b14c1585"). InnerVolumeSpecName "build-blob-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:32:35 crc kubenswrapper[4816]: I0316 00:32:35.302128 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/38d77c46-58bc-4dd3-a874-85e5b14c1585-container-storage-run\") pod \"38d77c46-58bc-4dd3-a874-85e5b14c1585\" (UID: \"38d77c46-58bc-4dd3-a874-85e5b14c1585\") " Mar 16 00:32:35 crc kubenswrapper[4816]: I0316 00:32:35.302191 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/38d77c46-58bc-4dd3-a874-85e5b14c1585-node-pullsecrets\") pod \"38d77c46-58bc-4dd3-a874-85e5b14c1585\" (UID: \"38d77c46-58bc-4dd3-a874-85e5b14c1585\") " Mar 16 00:32:35 crc kubenswrapper[4816]: I0316 00:32:35.302190 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/38d77c46-58bc-4dd3-a874-85e5b14c1585-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "38d77c46-58bc-4dd3-a874-85e5b14c1585" (UID: "38d77c46-58bc-4dd3-a874-85e5b14c1585"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:32:35 crc kubenswrapper[4816]: I0316 00:32:35.302252 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/38d77c46-58bc-4dd3-a874-85e5b14c1585-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "38d77c46-58bc-4dd3-a874-85e5b14c1585" (UID: "38d77c46-58bc-4dd3-a874-85e5b14c1585"). InnerVolumeSpecName "node-pullsecrets". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:32:35 crc kubenswrapper[4816]: I0316 00:32:35.302293 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/38d77c46-58bc-4dd3-a874-85e5b14c1585-build-ca-bundles\") pod \"38d77c46-58bc-4dd3-a874-85e5b14c1585\" (UID: \"38d77c46-58bc-4dd3-a874-85e5b14c1585\") " Mar 16 00:32:35 crc kubenswrapper[4816]: I0316 00:32:35.302318 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-fs5z5-push\" (UniqueName: \"kubernetes.io/secret/38d77c46-58bc-4dd3-a874-85e5b14c1585-builder-dockercfg-fs5z5-push\") pod \"38d77c46-58bc-4dd3-a874-85e5b14c1585\" (UID: \"38d77c46-58bc-4dd3-a874-85e5b14c1585\") " Mar 16 00:32:35 crc kubenswrapper[4816]: I0316 00:32:35.302384 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/38d77c46-58bc-4dd3-a874-85e5b14c1585-build-proxy-ca-bundles\") pod \"38d77c46-58bc-4dd3-a874-85e5b14c1585\" (UID: \"38d77c46-58bc-4dd3-a874-85e5b14c1585\") " Mar 16 00:32:35 crc kubenswrapper[4816]: I0316 00:32:35.302500 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38d77c46-58bc-4dd3-a874-85e5b14c1585-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "38d77c46-58bc-4dd3-a874-85e5b14c1585" (UID: "38d77c46-58bc-4dd3-a874-85e5b14c1585"). InnerVolumeSpecName "build-system-configs". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:32:35 crc kubenswrapper[4816]: I0316 00:32:35.302598 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-fs5z5-pull\" (UniqueName: \"kubernetes.io/secret/38d77c46-58bc-4dd3-a874-85e5b14c1585-builder-dockercfg-fs5z5-pull\") pod \"38d77c46-58bc-4dd3-a874-85e5b14c1585\" (UID: \"38d77c46-58bc-4dd3-a874-85e5b14c1585\") " Mar 16 00:32:35 crc kubenswrapper[4816]: I0316 00:32:35.303253 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/38d77c46-58bc-4dd3-a874-85e5b14c1585-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "38d77c46-58bc-4dd3-a874-85e5b14c1585" (UID: "38d77c46-58bc-4dd3-a874-85e5b14c1585"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:32:35 crc kubenswrapper[4816]: I0316 00:32:35.303364 4816 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/38d77c46-58bc-4dd3-a874-85e5b14c1585-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 16 00:32:35 crc kubenswrapper[4816]: I0316 00:32:35.303387 4816 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/38d77c46-58bc-4dd3-a874-85e5b14c1585-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 16 00:32:35 crc kubenswrapper[4816]: I0316 00:32:35.303378 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38d77c46-58bc-4dd3-a874-85e5b14c1585-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "38d77c46-58bc-4dd3-a874-85e5b14c1585" (UID: "38d77c46-58bc-4dd3-a874-85e5b14c1585"). InnerVolumeSpecName "build-proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:32:35 crc kubenswrapper[4816]: I0316 00:32:35.303400 4816 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/38d77c46-58bc-4dd3-a874-85e5b14c1585-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 16 00:32:35 crc kubenswrapper[4816]: I0316 00:32:35.303442 4816 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/38d77c46-58bc-4dd3-a874-85e5b14c1585-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 16 00:32:35 crc kubenswrapper[4816]: I0316 00:32:35.303463 4816 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/38d77c46-58bc-4dd3-a874-85e5b14c1585-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 16 00:32:35 crc kubenswrapper[4816]: I0316 00:32:35.303481 4816 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/38d77c46-58bc-4dd3-a874-85e5b14c1585-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 16 00:32:35 crc kubenswrapper[4816]: I0316 00:32:35.303627 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38d77c46-58bc-4dd3-a874-85e5b14c1585-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "38d77c46-58bc-4dd3-a874-85e5b14c1585" (UID: "38d77c46-58bc-4dd3-a874-85e5b14c1585"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:32:35 crc kubenswrapper[4816]: I0316 00:32:35.303976 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/38d77c46-58bc-4dd3-a874-85e5b14c1585-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "38d77c46-58bc-4dd3-a874-85e5b14c1585" (UID: "38d77c46-58bc-4dd3-a874-85e5b14c1585"). InnerVolumeSpecName "container-storage-root". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:32:35 crc kubenswrapper[4816]: I0316 00:32:35.308003 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38d77c46-58bc-4dd3-a874-85e5b14c1585-builder-dockercfg-fs5z5-pull" (OuterVolumeSpecName: "builder-dockercfg-fs5z5-pull") pod "38d77c46-58bc-4dd3-a874-85e5b14c1585" (UID: "38d77c46-58bc-4dd3-a874-85e5b14c1585"). InnerVolumeSpecName "builder-dockercfg-fs5z5-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:32:35 crc kubenswrapper[4816]: I0316 00:32:35.308080 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38d77c46-58bc-4dd3-a874-85e5b14c1585-builder-dockercfg-fs5z5-push" (OuterVolumeSpecName: "builder-dockercfg-fs5z5-push") pod "38d77c46-58bc-4dd3-a874-85e5b14c1585" (UID: "38d77c46-58bc-4dd3-a874-85e5b14c1585"). InnerVolumeSpecName "builder-dockercfg-fs5z5-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:32:35 crc kubenswrapper[4816]: I0316 00:32:35.309602 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38d77c46-58bc-4dd3-a874-85e5b14c1585-kube-api-access-vftwd" (OuterVolumeSpecName: "kube-api-access-vftwd") pod "38d77c46-58bc-4dd3-a874-85e5b14c1585" (UID: "38d77c46-58bc-4dd3-a874-85e5b14c1585"). InnerVolumeSpecName "kube-api-access-vftwd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:32:35 crc kubenswrapper[4816]: I0316 00:32:35.405448 4816 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/38d77c46-58bc-4dd3-a874-85e5b14c1585-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 16 00:32:35 crc kubenswrapper[4816]: I0316 00:32:35.405506 4816 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-fs5z5-push\" (UniqueName: \"kubernetes.io/secret/38d77c46-58bc-4dd3-a874-85e5b14c1585-builder-dockercfg-fs5z5-push\") on node \"crc\" DevicePath \"\"" Mar 16 00:32:35 crc kubenswrapper[4816]: I0316 00:32:35.405528 4816 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/38d77c46-58bc-4dd3-a874-85e5b14c1585-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 16 00:32:35 crc kubenswrapper[4816]: I0316 00:32:35.405547 4816 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-fs5z5-pull\" (UniqueName: \"kubernetes.io/secret/38d77c46-58bc-4dd3-a874-85e5b14c1585-builder-dockercfg-fs5z5-pull\") on node \"crc\" DevicePath \"\"" Mar 16 00:32:35 crc kubenswrapper[4816]: I0316 00:32:35.405601 4816 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/38d77c46-58bc-4dd3-a874-85e5b14c1585-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 16 00:32:35 crc kubenswrapper[4816]: I0316 00:32:35.405624 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vftwd\" (UniqueName: \"kubernetes.io/projected/38d77c46-58bc-4dd3-a874-85e5b14c1585-kube-api-access-vftwd\") on node \"crc\" DevicePath \"\"" Mar 16 00:32:35 crc kubenswrapper[4816]: I0316 00:32:35.952172 4816 log.go:25] "Finished parsing log file" 
path="/var/log/pods/service-telemetry_service-telemetry-operator-bundle-1-build_38d77c46-58bc-4dd3-a874-85e5b14c1585/docker-build/0.log" Mar 16 00:32:35 crc kubenswrapper[4816]: I0316 00:32:35.953172 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-1-build" event={"ID":"38d77c46-58bc-4dd3-a874-85e5b14c1585","Type":"ContainerDied","Data":"eca8ebc315293f56134648b608d393b026abc700dcac227487eb6afcf4852dbb"} Mar 16 00:32:35 crc kubenswrapper[4816]: I0316 00:32:35.953220 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eca8ebc315293f56134648b608d393b026abc700dcac227487eb6afcf4852dbb" Mar 16 00:32:35 crc kubenswrapper[4816]: I0316 00:32:35.953254 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 16 00:32:39 crc kubenswrapper[4816]: I0316 00:32:39.923715 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/service-telemetry-operator-bundle-1-build"] Mar 16 00:32:39 crc kubenswrapper[4816]: I0316 00:32:39.931998 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/service-telemetry-operator-bundle-1-build"] Mar 16 00:32:41 crc kubenswrapper[4816]: I0316 00:32:41.425649 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-tmfj2" Mar 16 00:32:41 crc kubenswrapper[4816]: I0316 00:32:41.470156 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-tmfj2" Mar 16 00:32:41 crc kubenswrapper[4816]: I0316 00:32:41.549753 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/service-telemetry-operator-bundle-2-build"] Mar 16 00:32:41 crc kubenswrapper[4816]: E0316 00:32:41.550041 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38d77c46-58bc-4dd3-a874-85e5b14c1585" 
containerName="docker-build" Mar 16 00:32:41 crc kubenswrapper[4816]: I0316 00:32:41.550067 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="38d77c46-58bc-4dd3-a874-85e5b14c1585" containerName="docker-build" Mar 16 00:32:41 crc kubenswrapper[4816]: E0316 00:32:41.550090 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38d77c46-58bc-4dd3-a874-85e5b14c1585" containerName="manage-dockerfile" Mar 16 00:32:41 crc kubenswrapper[4816]: I0316 00:32:41.550099 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="38d77c46-58bc-4dd3-a874-85e5b14c1585" containerName="manage-dockerfile" Mar 16 00:32:41 crc kubenswrapper[4816]: I0316 00:32:41.550226 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="38d77c46-58bc-4dd3-a874-85e5b14c1585" containerName="docker-build" Mar 16 00:32:41 crc kubenswrapper[4816]: I0316 00:32:41.551443 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 16 00:32:41 crc kubenswrapper[4816]: I0316 00:32:41.555285 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-bundle-2-ca" Mar 16 00:32:41 crc kubenswrapper[4816]: I0316 00:32:41.555420 4816 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-fs5z5" Mar 16 00:32:41 crc kubenswrapper[4816]: I0316 00:32:41.555310 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-bundle-2-global-ca" Mar 16 00:32:41 crc kubenswrapper[4816]: I0316 00:32:41.555849 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-bundle-2-sys-config" Mar 16 00:32:41 crc kubenswrapper[4816]: I0316 00:32:41.570229 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-bundle-2-build"] Mar 16 00:32:41 
crc kubenswrapper[4816]: I0316 00:32:41.663591 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tmfj2"] Mar 16 00:32:41 crc kubenswrapper[4816]: I0316 00:32:41.675775 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38d77c46-58bc-4dd3-a874-85e5b14c1585" path="/var/lib/kubelet/pods/38d77c46-58bc-4dd3-a874-85e5b14c1585/volumes" Mar 16 00:32:41 crc kubenswrapper[4816]: I0316 00:32:41.688262 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/4b2b3bef-a66e-4caa-bf69-164562b1dfd6-build-blob-cache\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"4b2b3bef-a66e-4caa-bf69-164562b1dfd6\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 16 00:32:41 crc kubenswrapper[4816]: I0316 00:32:41.688298 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/4b2b3bef-a66e-4caa-bf69-164562b1dfd6-buildworkdir\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"4b2b3bef-a66e-4caa-bf69-164562b1dfd6\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 16 00:32:41 crc kubenswrapper[4816]: I0316 00:32:41.688321 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-fs5z5-pull\" (UniqueName: \"kubernetes.io/secret/4b2b3bef-a66e-4caa-bf69-164562b1dfd6-builder-dockercfg-fs5z5-pull\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"4b2b3bef-a66e-4caa-bf69-164562b1dfd6\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 16 00:32:41 crc kubenswrapper[4816]: I0316 00:32:41.688342 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/4b2b3bef-a66e-4caa-bf69-164562b1dfd6-build-ca-bundles\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"4b2b3bef-a66e-4caa-bf69-164562b1dfd6\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 16 00:32:41 crc kubenswrapper[4816]: I0316 00:32:41.688368 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/4b2b3bef-a66e-4caa-bf69-164562b1dfd6-container-storage-run\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"4b2b3bef-a66e-4caa-bf69-164562b1dfd6\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 16 00:32:41 crc kubenswrapper[4816]: I0316 00:32:41.688387 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-fs5z5-push\" (UniqueName: \"kubernetes.io/secret/4b2b3bef-a66e-4caa-bf69-164562b1dfd6-builder-dockercfg-fs5z5-push\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"4b2b3bef-a66e-4caa-bf69-164562b1dfd6\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 16 00:32:41 crc kubenswrapper[4816]: I0316 00:32:41.688406 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4b2b3bef-a66e-4caa-bf69-164562b1dfd6-build-proxy-ca-bundles\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"4b2b3bef-a66e-4caa-bf69-164562b1dfd6\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 16 00:32:41 crc kubenswrapper[4816]: I0316 00:32:41.688423 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/4b2b3bef-a66e-4caa-bf69-164562b1dfd6-container-storage-root\") pod \"service-telemetry-operator-bundle-2-build\" (UID: 
\"4b2b3bef-a66e-4caa-bf69-164562b1dfd6\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 16 00:32:41 crc kubenswrapper[4816]: I0316 00:32:41.688442 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4nqgb\" (UniqueName: \"kubernetes.io/projected/4b2b3bef-a66e-4caa-bf69-164562b1dfd6-kube-api-access-4nqgb\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"4b2b3bef-a66e-4caa-bf69-164562b1dfd6\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 16 00:32:41 crc kubenswrapper[4816]: I0316 00:32:41.688463 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/4b2b3bef-a66e-4caa-bf69-164562b1dfd6-node-pullsecrets\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"4b2b3bef-a66e-4caa-bf69-164562b1dfd6\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 16 00:32:41 crc kubenswrapper[4816]: I0316 00:32:41.688632 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/4b2b3bef-a66e-4caa-bf69-164562b1dfd6-buildcachedir\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"4b2b3bef-a66e-4caa-bf69-164562b1dfd6\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 16 00:32:41 crc kubenswrapper[4816]: I0316 00:32:41.688693 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/4b2b3bef-a66e-4caa-bf69-164562b1dfd6-build-system-configs\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"4b2b3bef-a66e-4caa-bf69-164562b1dfd6\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 16 00:32:41 crc kubenswrapper[4816]: I0316 00:32:41.789749 4816 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/4b2b3bef-a66e-4caa-bf69-164562b1dfd6-build-system-configs\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"4b2b3bef-a66e-4caa-bf69-164562b1dfd6\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 16 00:32:41 crc kubenswrapper[4816]: I0316 00:32:41.789824 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/4b2b3bef-a66e-4caa-bf69-164562b1dfd6-build-blob-cache\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"4b2b3bef-a66e-4caa-bf69-164562b1dfd6\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 16 00:32:41 crc kubenswrapper[4816]: I0316 00:32:41.789844 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/4b2b3bef-a66e-4caa-bf69-164562b1dfd6-buildworkdir\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"4b2b3bef-a66e-4caa-bf69-164562b1dfd6\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 16 00:32:41 crc kubenswrapper[4816]: I0316 00:32:41.789863 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-fs5z5-pull\" (UniqueName: \"kubernetes.io/secret/4b2b3bef-a66e-4caa-bf69-164562b1dfd6-builder-dockercfg-fs5z5-pull\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"4b2b3bef-a66e-4caa-bf69-164562b1dfd6\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 16 00:32:41 crc kubenswrapper[4816]: I0316 00:32:41.789882 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4b2b3bef-a66e-4caa-bf69-164562b1dfd6-build-ca-bundles\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"4b2b3bef-a66e-4caa-bf69-164562b1dfd6\") " 
pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 16 00:32:41 crc kubenswrapper[4816]: I0316 00:32:41.789913 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/4b2b3bef-a66e-4caa-bf69-164562b1dfd6-container-storage-run\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"4b2b3bef-a66e-4caa-bf69-164562b1dfd6\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 16 00:32:41 crc kubenswrapper[4816]: I0316 00:32:41.789932 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-fs5z5-push\" (UniqueName: \"kubernetes.io/secret/4b2b3bef-a66e-4caa-bf69-164562b1dfd6-builder-dockercfg-fs5z5-push\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"4b2b3bef-a66e-4caa-bf69-164562b1dfd6\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 16 00:32:41 crc kubenswrapper[4816]: I0316 00:32:41.789950 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4b2b3bef-a66e-4caa-bf69-164562b1dfd6-build-proxy-ca-bundles\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"4b2b3bef-a66e-4caa-bf69-164562b1dfd6\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 16 00:32:41 crc kubenswrapper[4816]: I0316 00:32:41.789966 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/4b2b3bef-a66e-4caa-bf69-164562b1dfd6-container-storage-root\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"4b2b3bef-a66e-4caa-bf69-164562b1dfd6\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 16 00:32:41 crc kubenswrapper[4816]: I0316 00:32:41.789984 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4nqgb\" 
(UniqueName: \"kubernetes.io/projected/4b2b3bef-a66e-4caa-bf69-164562b1dfd6-kube-api-access-4nqgb\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"4b2b3bef-a66e-4caa-bf69-164562b1dfd6\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 16 00:32:41 crc kubenswrapper[4816]: I0316 00:32:41.790004 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/4b2b3bef-a66e-4caa-bf69-164562b1dfd6-node-pullsecrets\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"4b2b3bef-a66e-4caa-bf69-164562b1dfd6\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 16 00:32:41 crc kubenswrapper[4816]: I0316 00:32:41.790030 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/4b2b3bef-a66e-4caa-bf69-164562b1dfd6-buildcachedir\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"4b2b3bef-a66e-4caa-bf69-164562b1dfd6\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 16 00:32:41 crc kubenswrapper[4816]: I0316 00:32:41.790086 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/4b2b3bef-a66e-4caa-bf69-164562b1dfd6-buildcachedir\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"4b2b3bef-a66e-4caa-bf69-164562b1dfd6\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 16 00:32:41 crc kubenswrapper[4816]: I0316 00:32:41.790151 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/4b2b3bef-a66e-4caa-bf69-164562b1dfd6-node-pullsecrets\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"4b2b3bef-a66e-4caa-bf69-164562b1dfd6\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 16 00:32:41 crc kubenswrapper[4816]: I0316 
00:32:41.790458 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/4b2b3bef-a66e-4caa-bf69-164562b1dfd6-container-storage-run\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"4b2b3bef-a66e-4caa-bf69-164562b1dfd6\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 16 00:32:41 crc kubenswrapper[4816]: I0316 00:32:41.790495 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/4b2b3bef-a66e-4caa-bf69-164562b1dfd6-container-storage-root\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"4b2b3bef-a66e-4caa-bf69-164562b1dfd6\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 16 00:32:41 crc kubenswrapper[4816]: I0316 00:32:41.790817 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4b2b3bef-a66e-4caa-bf69-164562b1dfd6-build-proxy-ca-bundles\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"4b2b3bef-a66e-4caa-bf69-164562b1dfd6\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 16 00:32:41 crc kubenswrapper[4816]: I0316 00:32:41.791705 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/4b2b3bef-a66e-4caa-bf69-164562b1dfd6-buildworkdir\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"4b2b3bef-a66e-4caa-bf69-164562b1dfd6\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 16 00:32:41 crc kubenswrapper[4816]: I0316 00:32:41.791757 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/4b2b3bef-a66e-4caa-bf69-164562b1dfd6-build-system-configs\") pod \"service-telemetry-operator-bundle-2-build\" (UID: 
\"4b2b3bef-a66e-4caa-bf69-164562b1dfd6\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 16 00:32:41 crc kubenswrapper[4816]: I0316 00:32:41.791888 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4b2b3bef-a66e-4caa-bf69-164562b1dfd6-build-ca-bundles\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"4b2b3bef-a66e-4caa-bf69-164562b1dfd6\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 16 00:32:41 crc kubenswrapper[4816]: I0316 00:32:41.792023 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/4b2b3bef-a66e-4caa-bf69-164562b1dfd6-build-blob-cache\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"4b2b3bef-a66e-4caa-bf69-164562b1dfd6\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 16 00:32:41 crc kubenswrapper[4816]: I0316 00:32:41.794539 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-fs5z5-push\" (UniqueName: \"kubernetes.io/secret/4b2b3bef-a66e-4caa-bf69-164562b1dfd6-builder-dockercfg-fs5z5-push\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"4b2b3bef-a66e-4caa-bf69-164562b1dfd6\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 16 00:32:41 crc kubenswrapper[4816]: I0316 00:32:41.796204 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-fs5z5-pull\" (UniqueName: \"kubernetes.io/secret/4b2b3bef-a66e-4caa-bf69-164562b1dfd6-builder-dockercfg-fs5z5-pull\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"4b2b3bef-a66e-4caa-bf69-164562b1dfd6\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 16 00:32:41 crc kubenswrapper[4816]: I0316 00:32:41.810235 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4nqgb\" 
(UniqueName: \"kubernetes.io/projected/4b2b3bef-a66e-4caa-bf69-164562b1dfd6-kube-api-access-4nqgb\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"4b2b3bef-a66e-4caa-bf69-164562b1dfd6\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 16 00:32:41 crc kubenswrapper[4816]: I0316 00:32:41.866751 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 16 00:32:42 crc kubenswrapper[4816]: I0316 00:32:42.047776 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-bundle-2-build"] Mar 16 00:32:43 crc kubenswrapper[4816]: I0316 00:32:43.003692 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-2-build" event={"ID":"4b2b3bef-a66e-4caa-bf69-164562b1dfd6","Type":"ContainerStarted","Data":"0366a024e1fb0871713f897d2b15717fdfe061cb2143d61741b0493ff4eab489"} Mar 16 00:32:43 crc kubenswrapper[4816]: I0316 00:32:43.004431 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-2-build" event={"ID":"4b2b3bef-a66e-4caa-bf69-164562b1dfd6","Type":"ContainerStarted","Data":"4e56eb9def807a83219554ca3506a988f161f2353f6b9de86cc0c12cc3ff1192"} Mar 16 00:32:43 crc kubenswrapper[4816]: I0316 00:32:43.003779 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-tmfj2" podUID="05447e0c-cfec-4548-a367-b4058cd9ee40" containerName="registry-server" containerID="cri-o://ad40cf381627305d1b58213b1985a792b55e9459828e9e50b09a60ed1eb6cd2a" gracePeriod=2 Mar 16 00:32:43 crc kubenswrapper[4816]: I0316 00:32:43.370248 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-tmfj2" Mar 16 00:32:43 crc kubenswrapper[4816]: I0316 00:32:43.526951 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/05447e0c-cfec-4548-a367-b4058cd9ee40-utilities\") pod \"05447e0c-cfec-4548-a367-b4058cd9ee40\" (UID: \"05447e0c-cfec-4548-a367-b4058cd9ee40\") " Mar 16 00:32:43 crc kubenswrapper[4816]: I0316 00:32:43.527086 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/05447e0c-cfec-4548-a367-b4058cd9ee40-catalog-content\") pod \"05447e0c-cfec-4548-a367-b4058cd9ee40\" (UID: \"05447e0c-cfec-4548-a367-b4058cd9ee40\") " Mar 16 00:32:43 crc kubenswrapper[4816]: I0316 00:32:43.527206 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9bmlt\" (UniqueName: \"kubernetes.io/projected/05447e0c-cfec-4548-a367-b4058cd9ee40-kube-api-access-9bmlt\") pod \"05447e0c-cfec-4548-a367-b4058cd9ee40\" (UID: \"05447e0c-cfec-4548-a367-b4058cd9ee40\") " Mar 16 00:32:43 crc kubenswrapper[4816]: I0316 00:32:43.528108 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/05447e0c-cfec-4548-a367-b4058cd9ee40-utilities" (OuterVolumeSpecName: "utilities") pod "05447e0c-cfec-4548-a367-b4058cd9ee40" (UID: "05447e0c-cfec-4548-a367-b4058cd9ee40"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:32:43 crc kubenswrapper[4816]: I0316 00:32:43.532975 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05447e0c-cfec-4548-a367-b4058cd9ee40-kube-api-access-9bmlt" (OuterVolumeSpecName: "kube-api-access-9bmlt") pod "05447e0c-cfec-4548-a367-b4058cd9ee40" (UID: "05447e0c-cfec-4548-a367-b4058cd9ee40"). InnerVolumeSpecName "kube-api-access-9bmlt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:32:43 crc kubenswrapper[4816]: I0316 00:32:43.629099 4816 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/05447e0c-cfec-4548-a367-b4058cd9ee40-utilities\") on node \"crc\" DevicePath \"\"" Mar 16 00:32:43 crc kubenswrapper[4816]: I0316 00:32:43.629143 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9bmlt\" (UniqueName: \"kubernetes.io/projected/05447e0c-cfec-4548-a367-b4058cd9ee40-kube-api-access-9bmlt\") on node \"crc\" DevicePath \"\"" Mar 16 00:32:43 crc kubenswrapper[4816]: I0316 00:32:43.722343 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/05447e0c-cfec-4548-a367-b4058cd9ee40-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "05447e0c-cfec-4548-a367-b4058cd9ee40" (UID: "05447e0c-cfec-4548-a367-b4058cd9ee40"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:32:43 crc kubenswrapper[4816]: I0316 00:32:43.730187 4816 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/05447e0c-cfec-4548-a367-b4058cd9ee40-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 16 00:32:44 crc kubenswrapper[4816]: I0316 00:32:44.012450 4816 generic.go:334] "Generic (PLEG): container finished" podID="05447e0c-cfec-4548-a367-b4058cd9ee40" containerID="ad40cf381627305d1b58213b1985a792b55e9459828e9e50b09a60ed1eb6cd2a" exitCode=0 Mar 16 00:32:44 crc kubenswrapper[4816]: I0316 00:32:44.012535 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-tmfj2" Mar 16 00:32:44 crc kubenswrapper[4816]: I0316 00:32:44.012566 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tmfj2" event={"ID":"05447e0c-cfec-4548-a367-b4058cd9ee40","Type":"ContainerDied","Data":"ad40cf381627305d1b58213b1985a792b55e9459828e9e50b09a60ed1eb6cd2a"} Mar 16 00:32:44 crc kubenswrapper[4816]: I0316 00:32:44.013014 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tmfj2" event={"ID":"05447e0c-cfec-4548-a367-b4058cd9ee40","Type":"ContainerDied","Data":"788e7b9c485e2da4d4c729d6018e362f67e9125881eaf7b3347cf2e96230957c"} Mar 16 00:32:44 crc kubenswrapper[4816]: I0316 00:32:44.013032 4816 scope.go:117] "RemoveContainer" containerID="ad40cf381627305d1b58213b1985a792b55e9459828e9e50b09a60ed1eb6cd2a" Mar 16 00:32:44 crc kubenswrapper[4816]: I0316 00:32:44.014526 4816 generic.go:334] "Generic (PLEG): container finished" podID="4b2b3bef-a66e-4caa-bf69-164562b1dfd6" containerID="0366a024e1fb0871713f897d2b15717fdfe061cb2143d61741b0493ff4eab489" exitCode=0 Mar 16 00:32:44 crc kubenswrapper[4816]: I0316 00:32:44.014572 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-2-build" event={"ID":"4b2b3bef-a66e-4caa-bf69-164562b1dfd6","Type":"ContainerDied","Data":"0366a024e1fb0871713f897d2b15717fdfe061cb2143d61741b0493ff4eab489"} Mar 16 00:32:44 crc kubenswrapper[4816]: I0316 00:32:44.032739 4816 scope.go:117] "RemoveContainer" containerID="b6e0ddce6965ab602acc7ac1b54b27f26e5eabfe05171bdd286f5ad548b30ea8" Mar 16 00:32:44 crc kubenswrapper[4816]: I0316 00:32:44.075638 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tmfj2"] Mar 16 00:32:44 crc kubenswrapper[4816]: I0316 00:32:44.077978 4816 scope.go:117] "RemoveContainer" 
containerID="25145667416f108acf66af8cb8050c2e3a99db03759d435f751fb8c2664712c6" Mar 16 00:32:44 crc kubenswrapper[4816]: I0316 00:32:44.080763 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-tmfj2"] Mar 16 00:32:44 crc kubenswrapper[4816]: I0316 00:32:44.104293 4816 scope.go:117] "RemoveContainer" containerID="ad40cf381627305d1b58213b1985a792b55e9459828e9e50b09a60ed1eb6cd2a" Mar 16 00:32:44 crc kubenswrapper[4816]: E0316 00:32:44.104830 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad40cf381627305d1b58213b1985a792b55e9459828e9e50b09a60ed1eb6cd2a\": container with ID starting with ad40cf381627305d1b58213b1985a792b55e9459828e9e50b09a60ed1eb6cd2a not found: ID does not exist" containerID="ad40cf381627305d1b58213b1985a792b55e9459828e9e50b09a60ed1eb6cd2a" Mar 16 00:32:44 crc kubenswrapper[4816]: I0316 00:32:44.104883 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad40cf381627305d1b58213b1985a792b55e9459828e9e50b09a60ed1eb6cd2a"} err="failed to get container status \"ad40cf381627305d1b58213b1985a792b55e9459828e9e50b09a60ed1eb6cd2a\": rpc error: code = NotFound desc = could not find container \"ad40cf381627305d1b58213b1985a792b55e9459828e9e50b09a60ed1eb6cd2a\": container with ID starting with ad40cf381627305d1b58213b1985a792b55e9459828e9e50b09a60ed1eb6cd2a not found: ID does not exist" Mar 16 00:32:44 crc kubenswrapper[4816]: I0316 00:32:44.104909 4816 scope.go:117] "RemoveContainer" containerID="b6e0ddce6965ab602acc7ac1b54b27f26e5eabfe05171bdd286f5ad548b30ea8" Mar 16 00:32:44 crc kubenswrapper[4816]: E0316 00:32:44.105244 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b6e0ddce6965ab602acc7ac1b54b27f26e5eabfe05171bdd286f5ad548b30ea8\": container with ID starting with 
b6e0ddce6965ab602acc7ac1b54b27f26e5eabfe05171bdd286f5ad548b30ea8 not found: ID does not exist" containerID="b6e0ddce6965ab602acc7ac1b54b27f26e5eabfe05171bdd286f5ad548b30ea8" Mar 16 00:32:44 crc kubenswrapper[4816]: I0316 00:32:44.105279 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6e0ddce6965ab602acc7ac1b54b27f26e5eabfe05171bdd286f5ad548b30ea8"} err="failed to get container status \"b6e0ddce6965ab602acc7ac1b54b27f26e5eabfe05171bdd286f5ad548b30ea8\": rpc error: code = NotFound desc = could not find container \"b6e0ddce6965ab602acc7ac1b54b27f26e5eabfe05171bdd286f5ad548b30ea8\": container with ID starting with b6e0ddce6965ab602acc7ac1b54b27f26e5eabfe05171bdd286f5ad548b30ea8 not found: ID does not exist" Mar 16 00:32:44 crc kubenswrapper[4816]: I0316 00:32:44.105304 4816 scope.go:117] "RemoveContainer" containerID="25145667416f108acf66af8cb8050c2e3a99db03759d435f751fb8c2664712c6" Mar 16 00:32:44 crc kubenswrapper[4816]: E0316 00:32:44.105579 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"25145667416f108acf66af8cb8050c2e3a99db03759d435f751fb8c2664712c6\": container with ID starting with 25145667416f108acf66af8cb8050c2e3a99db03759d435f751fb8c2664712c6 not found: ID does not exist" containerID="25145667416f108acf66af8cb8050c2e3a99db03759d435f751fb8c2664712c6" Mar 16 00:32:44 crc kubenswrapper[4816]: I0316 00:32:44.105617 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25145667416f108acf66af8cb8050c2e3a99db03759d435f751fb8c2664712c6"} err="failed to get container status \"25145667416f108acf66af8cb8050c2e3a99db03759d435f751fb8c2664712c6\": rpc error: code = NotFound desc = could not find container \"25145667416f108acf66af8cb8050c2e3a99db03759d435f751fb8c2664712c6\": container with ID starting with 25145667416f108acf66af8cb8050c2e3a99db03759d435f751fb8c2664712c6 not found: ID does not 
exist" Mar 16 00:32:45 crc kubenswrapper[4816]: I0316 00:32:45.023116 4816 generic.go:334] "Generic (PLEG): container finished" podID="4b2b3bef-a66e-4caa-bf69-164562b1dfd6" containerID="d77eca03b254d697004fd4427a5501efd30125dde6d6f8186d5997a043ed31c6" exitCode=0 Mar 16 00:32:45 crc kubenswrapper[4816]: I0316 00:32:45.023178 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-2-build" event={"ID":"4b2b3bef-a66e-4caa-bf69-164562b1dfd6","Type":"ContainerDied","Data":"d77eca03b254d697004fd4427a5501efd30125dde6d6f8186d5997a043ed31c6"} Mar 16 00:32:45 crc kubenswrapper[4816]: I0316 00:32:45.067154 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-bundle-2-build_4b2b3bef-a66e-4caa-bf69-164562b1dfd6/manage-dockerfile/0.log" Mar 16 00:32:45 crc kubenswrapper[4816]: I0316 00:32:45.680334 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="05447e0c-cfec-4548-a367-b4058cd9ee40" path="/var/lib/kubelet/pods/05447e0c-cfec-4548-a367-b4058cd9ee40/volumes" Mar 16 00:32:46 crc kubenswrapper[4816]: I0316 00:32:46.033205 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-2-build" event={"ID":"4b2b3bef-a66e-4caa-bf69-164562b1dfd6","Type":"ContainerStarted","Data":"9220ae7cef81439d24ea0bf56c09c98a6723b9dba8abd218142c8db7d16c74ff"} Mar 16 00:32:46 crc kubenswrapper[4816]: I0316 00:32:46.059707 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/service-telemetry-operator-bundle-2-build" podStartSLOduration=5.059683912 podStartE2EDuration="5.059683912s" podCreationTimestamp="2026-03-16 00:32:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:32:46.05636657 +0000 UTC m=+1559.152666523" watchObservedRunningTime="2026-03-16 00:32:46.059683912 +0000 
UTC m=+1559.155983865" Mar 16 00:32:49 crc kubenswrapper[4816]: I0316 00:32:49.054089 4816 generic.go:334] "Generic (PLEG): container finished" podID="4b2b3bef-a66e-4caa-bf69-164562b1dfd6" containerID="9220ae7cef81439d24ea0bf56c09c98a6723b9dba8abd218142c8db7d16c74ff" exitCode=0 Mar 16 00:32:49 crc kubenswrapper[4816]: I0316 00:32:49.054187 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-2-build" event={"ID":"4b2b3bef-a66e-4caa-bf69-164562b1dfd6","Type":"ContainerDied","Data":"9220ae7cef81439d24ea0bf56c09c98a6723b9dba8abd218142c8db7d16c74ff"} Mar 16 00:32:50 crc kubenswrapper[4816]: I0316 00:32:50.278515 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 16 00:32:50 crc kubenswrapper[4816]: I0316 00:32:50.421251 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4b2b3bef-a66e-4caa-bf69-164562b1dfd6-build-proxy-ca-bundles\") pod \"4b2b3bef-a66e-4caa-bf69-164562b1dfd6\" (UID: \"4b2b3bef-a66e-4caa-bf69-164562b1dfd6\") " Mar 16 00:32:50 crc kubenswrapper[4816]: I0316 00:32:50.421322 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/4b2b3bef-a66e-4caa-bf69-164562b1dfd6-buildcachedir\") pod \"4b2b3bef-a66e-4caa-bf69-164562b1dfd6\" (UID: \"4b2b3bef-a66e-4caa-bf69-164562b1dfd6\") " Mar 16 00:32:50 crc kubenswrapper[4816]: I0316 00:32:50.421369 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/4b2b3bef-a66e-4caa-bf69-164562b1dfd6-container-storage-root\") pod \"4b2b3bef-a66e-4caa-bf69-164562b1dfd6\" (UID: \"4b2b3bef-a66e-4caa-bf69-164562b1dfd6\") " Mar 16 00:32:50 crc kubenswrapper[4816]: I0316 00:32:50.421407 4816 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/4b2b3bef-a66e-4caa-bf69-164562b1dfd6-build-system-configs\") pod \"4b2b3bef-a66e-4caa-bf69-164562b1dfd6\" (UID: \"4b2b3bef-a66e-4caa-bf69-164562b1dfd6\") " Mar 16 00:32:50 crc kubenswrapper[4816]: I0316 00:32:50.421435 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-fs5z5-push\" (UniqueName: \"kubernetes.io/secret/4b2b3bef-a66e-4caa-bf69-164562b1dfd6-builder-dockercfg-fs5z5-push\") pod \"4b2b3bef-a66e-4caa-bf69-164562b1dfd6\" (UID: \"4b2b3bef-a66e-4caa-bf69-164562b1dfd6\") " Mar 16 00:32:50 crc kubenswrapper[4816]: I0316 00:32:50.421506 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/4b2b3bef-a66e-4caa-bf69-164562b1dfd6-buildworkdir\") pod \"4b2b3bef-a66e-4caa-bf69-164562b1dfd6\" (UID: \"4b2b3bef-a66e-4caa-bf69-164562b1dfd6\") " Mar 16 00:32:50 crc kubenswrapper[4816]: I0316 00:32:50.421541 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4nqgb\" (UniqueName: \"kubernetes.io/projected/4b2b3bef-a66e-4caa-bf69-164562b1dfd6-kube-api-access-4nqgb\") pod \"4b2b3bef-a66e-4caa-bf69-164562b1dfd6\" (UID: \"4b2b3bef-a66e-4caa-bf69-164562b1dfd6\") " Mar 16 00:32:50 crc kubenswrapper[4816]: I0316 00:32:50.421600 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-fs5z5-pull\" (UniqueName: \"kubernetes.io/secret/4b2b3bef-a66e-4caa-bf69-164562b1dfd6-builder-dockercfg-fs5z5-pull\") pod \"4b2b3bef-a66e-4caa-bf69-164562b1dfd6\" (UID: \"4b2b3bef-a66e-4caa-bf69-164562b1dfd6\") " Mar 16 00:32:50 crc kubenswrapper[4816]: I0316 00:32:50.421651 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/4b2b3bef-a66e-4caa-bf69-164562b1dfd6-build-blob-cache\") pod \"4b2b3bef-a66e-4caa-bf69-164562b1dfd6\" (UID: \"4b2b3bef-a66e-4caa-bf69-164562b1dfd6\") " Mar 16 00:32:50 crc kubenswrapper[4816]: I0316 00:32:50.421675 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/4b2b3bef-a66e-4caa-bf69-164562b1dfd6-container-storage-run\") pod \"4b2b3bef-a66e-4caa-bf69-164562b1dfd6\" (UID: \"4b2b3bef-a66e-4caa-bf69-164562b1dfd6\") " Mar 16 00:32:50 crc kubenswrapper[4816]: I0316 00:32:50.421708 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4b2b3bef-a66e-4caa-bf69-164562b1dfd6-build-ca-bundles\") pod \"4b2b3bef-a66e-4caa-bf69-164562b1dfd6\" (UID: \"4b2b3bef-a66e-4caa-bf69-164562b1dfd6\") " Mar 16 00:32:50 crc kubenswrapper[4816]: I0316 00:32:50.421750 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/4b2b3bef-a66e-4caa-bf69-164562b1dfd6-node-pullsecrets\") pod \"4b2b3bef-a66e-4caa-bf69-164562b1dfd6\" (UID: \"4b2b3bef-a66e-4caa-bf69-164562b1dfd6\") " Mar 16 00:32:50 crc kubenswrapper[4816]: I0316 00:32:50.421879 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b2b3bef-a66e-4caa-bf69-164562b1dfd6-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "4b2b3bef-a66e-4caa-bf69-164562b1dfd6" (UID: "4b2b3bef-a66e-4caa-bf69-164562b1dfd6"). InnerVolumeSpecName "build-proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:32:50 crc kubenswrapper[4816]: I0316 00:32:50.421943 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4b2b3bef-a66e-4caa-bf69-164562b1dfd6-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "4b2b3bef-a66e-4caa-bf69-164562b1dfd6" (UID: "4b2b3bef-a66e-4caa-bf69-164562b1dfd6"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:32:50 crc kubenswrapper[4816]: I0316 00:32:50.422231 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4b2b3bef-a66e-4caa-bf69-164562b1dfd6-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "4b2b3bef-a66e-4caa-bf69-164562b1dfd6" (UID: "4b2b3bef-a66e-4caa-bf69-164562b1dfd6"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:32:50 crc kubenswrapper[4816]: I0316 00:32:50.422410 4816 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/4b2b3bef-a66e-4caa-bf69-164562b1dfd6-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 16 00:32:50 crc kubenswrapper[4816]: I0316 00:32:50.422431 4816 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4b2b3bef-a66e-4caa-bf69-164562b1dfd6-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 16 00:32:50 crc kubenswrapper[4816]: I0316 00:32:50.422442 4816 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/4b2b3bef-a66e-4caa-bf69-164562b1dfd6-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 16 00:32:50 crc kubenswrapper[4816]: I0316 00:32:50.422580 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b2b3bef-a66e-4caa-bf69-164562b1dfd6-build-ca-bundles" 
(OuterVolumeSpecName: "build-ca-bundles") pod "4b2b3bef-a66e-4caa-bf69-164562b1dfd6" (UID: "4b2b3bef-a66e-4caa-bf69-164562b1dfd6"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:32:50 crc kubenswrapper[4816]: I0316 00:32:50.422622 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b2b3bef-a66e-4caa-bf69-164562b1dfd6-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "4b2b3bef-a66e-4caa-bf69-164562b1dfd6" (UID: "4b2b3bef-a66e-4caa-bf69-164562b1dfd6"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:32:50 crc kubenswrapper[4816]: I0316 00:32:50.422884 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b2b3bef-a66e-4caa-bf69-164562b1dfd6-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "4b2b3bef-a66e-4caa-bf69-164562b1dfd6" (UID: "4b2b3bef-a66e-4caa-bf69-164562b1dfd6"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:32:50 crc kubenswrapper[4816]: I0316 00:32:50.422943 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b2b3bef-a66e-4caa-bf69-164562b1dfd6-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "4b2b3bef-a66e-4caa-bf69-164562b1dfd6" (UID: "4b2b3bef-a66e-4caa-bf69-164562b1dfd6"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:32:50 crc kubenswrapper[4816]: I0316 00:32:50.426038 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b2b3bef-a66e-4caa-bf69-164562b1dfd6-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "4b2b3bef-a66e-4caa-bf69-164562b1dfd6" (UID: "4b2b3bef-a66e-4caa-bf69-164562b1dfd6"). InnerVolumeSpecName "build-blob-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:32:50 crc kubenswrapper[4816]: I0316 00:32:50.428101 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b2b3bef-a66e-4caa-bf69-164562b1dfd6-builder-dockercfg-fs5z5-pull" (OuterVolumeSpecName: "builder-dockercfg-fs5z5-pull") pod "4b2b3bef-a66e-4caa-bf69-164562b1dfd6" (UID: "4b2b3bef-a66e-4caa-bf69-164562b1dfd6"). InnerVolumeSpecName "builder-dockercfg-fs5z5-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:32:50 crc kubenswrapper[4816]: I0316 00:32:50.428117 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b2b3bef-a66e-4caa-bf69-164562b1dfd6-builder-dockercfg-fs5z5-push" (OuterVolumeSpecName: "builder-dockercfg-fs5z5-push") pod "4b2b3bef-a66e-4caa-bf69-164562b1dfd6" (UID: "4b2b3bef-a66e-4caa-bf69-164562b1dfd6"). InnerVolumeSpecName "builder-dockercfg-fs5z5-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:32:50 crc kubenswrapper[4816]: I0316 00:32:50.429106 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b2b3bef-a66e-4caa-bf69-164562b1dfd6-kube-api-access-4nqgb" (OuterVolumeSpecName: "kube-api-access-4nqgb") pod "4b2b3bef-a66e-4caa-bf69-164562b1dfd6" (UID: "4b2b3bef-a66e-4caa-bf69-164562b1dfd6"). InnerVolumeSpecName "kube-api-access-4nqgb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:32:50 crc kubenswrapper[4816]: I0316 00:32:50.429819 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b2b3bef-a66e-4caa-bf69-164562b1dfd6-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "4b2b3bef-a66e-4caa-bf69-164562b1dfd6" (UID: "4b2b3bef-a66e-4caa-bf69-164562b1dfd6"). InnerVolumeSpecName "container-storage-root". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:32:50 crc kubenswrapper[4816]: I0316 00:32:50.523116 4816 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/4b2b3bef-a66e-4caa-bf69-164562b1dfd6-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 16 00:32:50 crc kubenswrapper[4816]: I0316 00:32:50.523148 4816 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/4b2b3bef-a66e-4caa-bf69-164562b1dfd6-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 16 00:32:50 crc kubenswrapper[4816]: I0316 00:32:50.523157 4816 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-fs5z5-push\" (UniqueName: \"kubernetes.io/secret/4b2b3bef-a66e-4caa-bf69-164562b1dfd6-builder-dockercfg-fs5z5-push\") on node \"crc\" DevicePath \"\"" Mar 16 00:32:50 crc kubenswrapper[4816]: I0316 00:32:50.523168 4816 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/4b2b3bef-a66e-4caa-bf69-164562b1dfd6-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 16 00:32:50 crc kubenswrapper[4816]: I0316 00:32:50.523176 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4nqgb\" (UniqueName: \"kubernetes.io/projected/4b2b3bef-a66e-4caa-bf69-164562b1dfd6-kube-api-access-4nqgb\") on node \"crc\" DevicePath \"\"" Mar 16 00:32:50 crc kubenswrapper[4816]: I0316 00:32:50.523184 4816 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-fs5z5-pull\" (UniqueName: \"kubernetes.io/secret/4b2b3bef-a66e-4caa-bf69-164562b1dfd6-builder-dockercfg-fs5z5-pull\") on node \"crc\" DevicePath \"\"" Mar 16 00:32:50 crc kubenswrapper[4816]: I0316 00:32:50.523192 4816 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/4b2b3bef-a66e-4caa-bf69-164562b1dfd6-build-blob-cache\") on node 
\"crc\" DevicePath \"\"" Mar 16 00:32:50 crc kubenswrapper[4816]: I0316 00:32:50.523235 4816 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/4b2b3bef-a66e-4caa-bf69-164562b1dfd6-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 16 00:32:50 crc kubenswrapper[4816]: I0316 00:32:50.523244 4816 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4b2b3bef-a66e-4caa-bf69-164562b1dfd6-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 16 00:32:51 crc kubenswrapper[4816]: I0316 00:32:51.071604 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-2-build" event={"ID":"4b2b3bef-a66e-4caa-bf69-164562b1dfd6","Type":"ContainerDied","Data":"4e56eb9def807a83219554ca3506a988f161f2353f6b9de86cc0c12cc3ff1192"} Mar 16 00:32:51 crc kubenswrapper[4816]: I0316 00:32:51.071829 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4e56eb9def807a83219554ca3506a988f161f2353f6b9de86cc0c12cc3ff1192" Mar 16 00:32:51 crc kubenswrapper[4816]: I0316 00:32:51.072232 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 16 00:32:54 crc kubenswrapper[4816]: I0316 00:32:54.231027 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/smart-gateway-operator-bundle-1-build"] Mar 16 00:32:54 crc kubenswrapper[4816]: E0316 00:32:54.232828 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b2b3bef-a66e-4caa-bf69-164562b1dfd6" containerName="git-clone" Mar 16 00:32:54 crc kubenswrapper[4816]: I0316 00:32:54.232866 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b2b3bef-a66e-4caa-bf69-164562b1dfd6" containerName="git-clone" Mar 16 00:32:54 crc kubenswrapper[4816]: E0316 00:32:54.232877 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05447e0c-cfec-4548-a367-b4058cd9ee40" containerName="registry-server" Mar 16 00:32:54 crc kubenswrapper[4816]: I0316 00:32:54.232883 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="05447e0c-cfec-4548-a367-b4058cd9ee40" containerName="registry-server" Mar 16 00:32:54 crc kubenswrapper[4816]: E0316 00:32:54.232895 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05447e0c-cfec-4548-a367-b4058cd9ee40" containerName="extract-utilities" Mar 16 00:32:54 crc kubenswrapper[4816]: I0316 00:32:54.232902 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="05447e0c-cfec-4548-a367-b4058cd9ee40" containerName="extract-utilities" Mar 16 00:32:54 crc kubenswrapper[4816]: E0316 00:32:54.232911 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b2b3bef-a66e-4caa-bf69-164562b1dfd6" containerName="docker-build" Mar 16 00:32:54 crc kubenswrapper[4816]: I0316 00:32:54.232919 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b2b3bef-a66e-4caa-bf69-164562b1dfd6" containerName="docker-build" Mar 16 00:32:54 crc kubenswrapper[4816]: E0316 00:32:54.232927 4816 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="4b2b3bef-a66e-4caa-bf69-164562b1dfd6" containerName="manage-dockerfile" Mar 16 00:32:54 crc kubenswrapper[4816]: I0316 00:32:54.232934 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b2b3bef-a66e-4caa-bf69-164562b1dfd6" containerName="manage-dockerfile" Mar 16 00:32:54 crc kubenswrapper[4816]: E0316 00:32:54.232946 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05447e0c-cfec-4548-a367-b4058cd9ee40" containerName="extract-content" Mar 16 00:32:54 crc kubenswrapper[4816]: I0316 00:32:54.232952 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="05447e0c-cfec-4548-a367-b4058cd9ee40" containerName="extract-content" Mar 16 00:32:54 crc kubenswrapper[4816]: I0316 00:32:54.233065 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b2b3bef-a66e-4caa-bf69-164562b1dfd6" containerName="docker-build" Mar 16 00:32:54 crc kubenswrapper[4816]: I0316 00:32:54.233084 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="05447e0c-cfec-4548-a367-b4058cd9ee40" containerName="registry-server" Mar 16 00:32:54 crc kubenswrapper[4816]: I0316 00:32:54.233738 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 16 00:32:54 crc kubenswrapper[4816]: I0316 00:32:54.236006 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-bundle-1-ca" Mar 16 00:32:54 crc kubenswrapper[4816]: I0316 00:32:54.236365 4816 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-fs5z5" Mar 16 00:32:54 crc kubenswrapper[4816]: I0316 00:32:54.236810 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-bundle-1-global-ca" Mar 16 00:32:54 crc kubenswrapper[4816]: I0316 00:32:54.236979 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-bundle-1-sys-config" Mar 16 00:32:54 crc kubenswrapper[4816]: I0316 00:32:54.242310 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-bundle-1-build"] Mar 16 00:32:54 crc kubenswrapper[4816]: I0316 00:32:54.371858 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/7c03433b-4b96-4172-b344-de3e72b52900-build-system-configs\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"7c03433b-4b96-4172-b344-de3e72b52900\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 16 00:32:54 crc kubenswrapper[4816]: I0316 00:32:54.372010 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/7c03433b-4b96-4172-b344-de3e72b52900-buildworkdir\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"7c03433b-4b96-4172-b344-de3e72b52900\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 16 00:32:54 crc kubenswrapper[4816]: I0316 00:32:54.372063 4816 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7c03433b-4b96-4172-b344-de3e72b52900-build-proxy-ca-bundles\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"7c03433b-4b96-4172-b344-de3e72b52900\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 16 00:32:54 crc kubenswrapper[4816]: I0316 00:32:54.372173 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/7c03433b-4b96-4172-b344-de3e72b52900-buildcachedir\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"7c03433b-4b96-4172-b344-de3e72b52900\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 16 00:32:54 crc kubenswrapper[4816]: I0316 00:32:54.372215 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-fs5z5-pull\" (UniqueName: \"kubernetes.io/secret/7c03433b-4b96-4172-b344-de3e72b52900-builder-dockercfg-fs5z5-pull\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"7c03433b-4b96-4172-b344-de3e72b52900\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 16 00:32:54 crc kubenswrapper[4816]: I0316 00:32:54.372260 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-fs5z5-push\" (UniqueName: \"kubernetes.io/secret/7c03433b-4b96-4172-b344-de3e72b52900-builder-dockercfg-fs5z5-push\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"7c03433b-4b96-4172-b344-de3e72b52900\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 16 00:32:54 crc kubenswrapper[4816]: I0316 00:32:54.372350 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dfr8f\" (UniqueName: \"kubernetes.io/projected/7c03433b-4b96-4172-b344-de3e72b52900-kube-api-access-dfr8f\") pod 
\"smart-gateway-operator-bundle-1-build\" (UID: \"7c03433b-4b96-4172-b344-de3e72b52900\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 16 00:32:54 crc kubenswrapper[4816]: I0316 00:32:54.372404 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/7c03433b-4b96-4172-b344-de3e72b52900-node-pullsecrets\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"7c03433b-4b96-4172-b344-de3e72b52900\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 16 00:32:54 crc kubenswrapper[4816]: I0316 00:32:54.372490 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/7c03433b-4b96-4172-b344-de3e72b52900-container-storage-run\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"7c03433b-4b96-4172-b344-de3e72b52900\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 16 00:32:54 crc kubenswrapper[4816]: I0316 00:32:54.372515 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7c03433b-4b96-4172-b344-de3e72b52900-build-ca-bundles\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"7c03433b-4b96-4172-b344-de3e72b52900\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 16 00:32:54 crc kubenswrapper[4816]: I0316 00:32:54.372609 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/7c03433b-4b96-4172-b344-de3e72b52900-build-blob-cache\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"7c03433b-4b96-4172-b344-de3e72b52900\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 16 00:32:54 crc kubenswrapper[4816]: I0316 00:32:54.372713 4816 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/7c03433b-4b96-4172-b344-de3e72b52900-container-storage-root\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"7c03433b-4b96-4172-b344-de3e72b52900\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 16 00:32:54 crc kubenswrapper[4816]: I0316 00:32:54.473903 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/7c03433b-4b96-4172-b344-de3e72b52900-buildworkdir\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"7c03433b-4b96-4172-b344-de3e72b52900\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 16 00:32:54 crc kubenswrapper[4816]: I0316 00:32:54.473955 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7c03433b-4b96-4172-b344-de3e72b52900-build-proxy-ca-bundles\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"7c03433b-4b96-4172-b344-de3e72b52900\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 16 00:32:54 crc kubenswrapper[4816]: I0316 00:32:54.473994 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/7c03433b-4b96-4172-b344-de3e72b52900-buildcachedir\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"7c03433b-4b96-4172-b344-de3e72b52900\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 16 00:32:54 crc kubenswrapper[4816]: I0316 00:32:54.474024 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-fs5z5-pull\" (UniqueName: \"kubernetes.io/secret/7c03433b-4b96-4172-b344-de3e72b52900-builder-dockercfg-fs5z5-pull\") pod \"smart-gateway-operator-bundle-1-build\" (UID: 
\"7c03433b-4b96-4172-b344-de3e72b52900\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 16 00:32:54 crc kubenswrapper[4816]: I0316 00:32:54.474053 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-fs5z5-push\" (UniqueName: \"kubernetes.io/secret/7c03433b-4b96-4172-b344-de3e72b52900-builder-dockercfg-fs5z5-push\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"7c03433b-4b96-4172-b344-de3e72b52900\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 16 00:32:54 crc kubenswrapper[4816]: I0316 00:32:54.474094 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dfr8f\" (UniqueName: \"kubernetes.io/projected/7c03433b-4b96-4172-b344-de3e72b52900-kube-api-access-dfr8f\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"7c03433b-4b96-4172-b344-de3e72b52900\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 16 00:32:54 crc kubenswrapper[4816]: I0316 00:32:54.474124 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/7c03433b-4b96-4172-b344-de3e72b52900-node-pullsecrets\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"7c03433b-4b96-4172-b344-de3e72b52900\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 16 00:32:54 crc kubenswrapper[4816]: I0316 00:32:54.474159 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/7c03433b-4b96-4172-b344-de3e72b52900-container-storage-run\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"7c03433b-4b96-4172-b344-de3e72b52900\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 16 00:32:54 crc kubenswrapper[4816]: I0316 00:32:54.474182 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" 
(UniqueName: \"kubernetes.io/configmap/7c03433b-4b96-4172-b344-de3e72b52900-build-ca-bundles\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"7c03433b-4b96-4172-b344-de3e72b52900\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 16 00:32:54 crc kubenswrapper[4816]: I0316 00:32:54.474214 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/7c03433b-4b96-4172-b344-de3e72b52900-build-blob-cache\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"7c03433b-4b96-4172-b344-de3e72b52900\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 16 00:32:54 crc kubenswrapper[4816]: I0316 00:32:54.474258 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/7c03433b-4b96-4172-b344-de3e72b52900-container-storage-root\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"7c03433b-4b96-4172-b344-de3e72b52900\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 16 00:32:54 crc kubenswrapper[4816]: I0316 00:32:54.474292 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/7c03433b-4b96-4172-b344-de3e72b52900-build-system-configs\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"7c03433b-4b96-4172-b344-de3e72b52900\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 16 00:32:54 crc kubenswrapper[4816]: I0316 00:32:54.474424 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/7c03433b-4b96-4172-b344-de3e72b52900-buildworkdir\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"7c03433b-4b96-4172-b344-de3e72b52900\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 16 00:32:54 crc kubenswrapper[4816]: I0316 00:32:54.474535 
4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/7c03433b-4b96-4172-b344-de3e72b52900-node-pullsecrets\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"7c03433b-4b96-4172-b344-de3e72b52900\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 16 00:32:54 crc kubenswrapper[4816]: I0316 00:32:54.474602 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/7c03433b-4b96-4172-b344-de3e72b52900-buildcachedir\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"7c03433b-4b96-4172-b344-de3e72b52900\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 16 00:32:54 crc kubenswrapper[4816]: I0316 00:32:54.474825 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7c03433b-4b96-4172-b344-de3e72b52900-build-proxy-ca-bundles\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"7c03433b-4b96-4172-b344-de3e72b52900\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 16 00:32:54 crc kubenswrapper[4816]: I0316 00:32:54.474943 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/7c03433b-4b96-4172-b344-de3e72b52900-build-system-configs\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"7c03433b-4b96-4172-b344-de3e72b52900\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 16 00:32:54 crc kubenswrapper[4816]: I0316 00:32:54.475738 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/7c03433b-4b96-4172-b344-de3e72b52900-build-blob-cache\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"7c03433b-4b96-4172-b344-de3e72b52900\") " 
pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 16 00:32:54 crc kubenswrapper[4816]: I0316 00:32:54.475763 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/7c03433b-4b96-4172-b344-de3e72b52900-container-storage-run\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"7c03433b-4b96-4172-b344-de3e72b52900\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 16 00:32:54 crc kubenswrapper[4816]: I0316 00:32:54.475757 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7c03433b-4b96-4172-b344-de3e72b52900-build-ca-bundles\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"7c03433b-4b96-4172-b344-de3e72b52900\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 16 00:32:54 crc kubenswrapper[4816]: I0316 00:32:54.476152 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/7c03433b-4b96-4172-b344-de3e72b52900-container-storage-root\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"7c03433b-4b96-4172-b344-de3e72b52900\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 16 00:32:54 crc kubenswrapper[4816]: I0316 00:32:54.481103 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-fs5z5-pull\" (UniqueName: \"kubernetes.io/secret/7c03433b-4b96-4172-b344-de3e72b52900-builder-dockercfg-fs5z5-pull\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"7c03433b-4b96-4172-b344-de3e72b52900\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 16 00:32:54 crc kubenswrapper[4816]: I0316 00:32:54.485943 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-fs5z5-push\" (UniqueName: 
\"kubernetes.io/secret/7c03433b-4b96-4172-b344-de3e72b52900-builder-dockercfg-fs5z5-push\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"7c03433b-4b96-4172-b344-de3e72b52900\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 16 00:32:54 crc kubenswrapper[4816]: I0316 00:32:54.491403 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dfr8f\" (UniqueName: \"kubernetes.io/projected/7c03433b-4b96-4172-b344-de3e72b52900-kube-api-access-dfr8f\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"7c03433b-4b96-4172-b344-de3e72b52900\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 16 00:32:54 crc kubenswrapper[4816]: I0316 00:32:54.549336 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 16 00:32:54 crc kubenswrapper[4816]: I0316 00:32:54.750986 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-bundle-1-build"] Mar 16 00:32:55 crc kubenswrapper[4816]: I0316 00:32:55.109617 4816 generic.go:334] "Generic (PLEG): container finished" podID="7c03433b-4b96-4172-b344-de3e72b52900" containerID="e30d5325e2ed5b1448d06f7c2b9b149b118aecd27db13855743e289187e36f13" exitCode=0 Mar 16 00:32:55 crc kubenswrapper[4816]: I0316 00:32:55.109670 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-1-build" event={"ID":"7c03433b-4b96-4172-b344-de3e72b52900","Type":"ContainerDied","Data":"e30d5325e2ed5b1448d06f7c2b9b149b118aecd27db13855743e289187e36f13"} Mar 16 00:32:55 crc kubenswrapper[4816]: I0316 00:32:55.109699 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-1-build" event={"ID":"7c03433b-4b96-4172-b344-de3e72b52900","Type":"ContainerStarted","Data":"68913983495e9d0f1ddcf3da8b311626b33a5d1edfe1be51675b53ba41003d13"} Mar 16 00:32:56 crc 
kubenswrapper[4816]: I0316 00:32:56.122198 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-bundle-1-build_7c03433b-4b96-4172-b344-de3e72b52900/docker-build/0.log" Mar 16 00:32:56 crc kubenswrapper[4816]: I0316 00:32:56.123072 4816 generic.go:334] "Generic (PLEG): container finished" podID="7c03433b-4b96-4172-b344-de3e72b52900" containerID="e4439cd35f13a68a04a6b45eaa00f3aeec10afe4bbea233d056d60648e32f1e4" exitCode=1 Mar 16 00:32:56 crc kubenswrapper[4816]: I0316 00:32:56.123118 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-1-build" event={"ID":"7c03433b-4b96-4172-b344-de3e72b52900","Type":"ContainerDied","Data":"e4439cd35f13a68a04a6b45eaa00f3aeec10afe4bbea233d056d60648e32f1e4"} Mar 16 00:32:57 crc kubenswrapper[4816]: I0316 00:32:57.352915 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-bundle-1-build_7c03433b-4b96-4172-b344-de3e72b52900/docker-build/0.log" Mar 16 00:32:57 crc kubenswrapper[4816]: I0316 00:32:57.353726 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 16 00:32:57 crc kubenswrapper[4816]: I0316 00:32:57.469112 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/7c03433b-4b96-4172-b344-de3e72b52900-container-storage-root\") pod \"7c03433b-4b96-4172-b344-de3e72b52900\" (UID: \"7c03433b-4b96-4172-b344-de3e72b52900\") " Mar 16 00:32:57 crc kubenswrapper[4816]: I0316 00:32:57.469199 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dfr8f\" (UniqueName: \"kubernetes.io/projected/7c03433b-4b96-4172-b344-de3e72b52900-kube-api-access-dfr8f\") pod \"7c03433b-4b96-4172-b344-de3e72b52900\" (UID: \"7c03433b-4b96-4172-b344-de3e72b52900\") " Mar 16 00:32:57 crc kubenswrapper[4816]: I0316 00:32:57.469273 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/7c03433b-4b96-4172-b344-de3e72b52900-build-system-configs\") pod \"7c03433b-4b96-4172-b344-de3e72b52900\" (UID: \"7c03433b-4b96-4172-b344-de3e72b52900\") " Mar 16 00:32:57 crc kubenswrapper[4816]: I0316 00:32:57.469327 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-fs5z5-push\" (UniqueName: \"kubernetes.io/secret/7c03433b-4b96-4172-b344-de3e72b52900-builder-dockercfg-fs5z5-push\") pod \"7c03433b-4b96-4172-b344-de3e72b52900\" (UID: \"7c03433b-4b96-4172-b344-de3e72b52900\") " Mar 16 00:32:57 crc kubenswrapper[4816]: I0316 00:32:57.469379 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/7c03433b-4b96-4172-b344-de3e72b52900-buildworkdir\") pod \"7c03433b-4b96-4172-b344-de3e72b52900\" (UID: \"7c03433b-4b96-4172-b344-de3e72b52900\") " Mar 16 00:32:57 crc kubenswrapper[4816]: I0316 00:32:57.469408 4816 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/7c03433b-4b96-4172-b344-de3e72b52900-buildcachedir\") pod \"7c03433b-4b96-4172-b344-de3e72b52900\" (UID: \"7c03433b-4b96-4172-b344-de3e72b52900\") " Mar 16 00:32:57 crc kubenswrapper[4816]: I0316 00:32:57.469436 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/7c03433b-4b96-4172-b344-de3e72b52900-node-pullsecrets\") pod \"7c03433b-4b96-4172-b344-de3e72b52900\" (UID: \"7c03433b-4b96-4172-b344-de3e72b52900\") " Mar 16 00:32:57 crc kubenswrapper[4816]: I0316 00:32:57.469496 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/7c03433b-4b96-4172-b344-de3e72b52900-build-blob-cache\") pod \"7c03433b-4b96-4172-b344-de3e72b52900\" (UID: \"7c03433b-4b96-4172-b344-de3e72b52900\") " Mar 16 00:32:57 crc kubenswrapper[4816]: I0316 00:32:57.469530 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-fs5z5-pull\" (UniqueName: \"kubernetes.io/secret/7c03433b-4b96-4172-b344-de3e72b52900-builder-dockercfg-fs5z5-pull\") pod \"7c03433b-4b96-4172-b344-de3e72b52900\" (UID: \"7c03433b-4b96-4172-b344-de3e72b52900\") " Mar 16 00:32:57 crc kubenswrapper[4816]: I0316 00:32:57.469592 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7c03433b-4b96-4172-b344-de3e72b52900-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "7c03433b-4b96-4172-b344-de3e72b52900" (UID: "7c03433b-4b96-4172-b344-de3e72b52900"). InnerVolumeSpecName "buildcachedir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:32:57 crc kubenswrapper[4816]: I0316 00:32:57.469650 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/7c03433b-4b96-4172-b344-de3e72b52900-container-storage-run\") pod \"7c03433b-4b96-4172-b344-de3e72b52900\" (UID: \"7c03433b-4b96-4172-b344-de3e72b52900\") " Mar 16 00:32:57 crc kubenswrapper[4816]: I0316 00:32:57.469699 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7c03433b-4b96-4172-b344-de3e72b52900-build-proxy-ca-bundles\") pod \"7c03433b-4b96-4172-b344-de3e72b52900\" (UID: \"7c03433b-4b96-4172-b344-de3e72b52900\") " Mar 16 00:32:57 crc kubenswrapper[4816]: I0316 00:32:57.469730 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7c03433b-4b96-4172-b344-de3e72b52900-build-ca-bundles\") pod \"7c03433b-4b96-4172-b344-de3e72b52900\" (UID: \"7c03433b-4b96-4172-b344-de3e72b52900\") " Mar 16 00:32:57 crc kubenswrapper[4816]: I0316 00:32:57.469760 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7c03433b-4b96-4172-b344-de3e72b52900-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "7c03433b-4b96-4172-b344-de3e72b52900" (UID: "7c03433b-4b96-4172-b344-de3e72b52900"). InnerVolumeSpecName "node-pullsecrets". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:32:57 crc kubenswrapper[4816]: I0316 00:32:57.470004 4816 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/7c03433b-4b96-4172-b344-de3e72b52900-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 16 00:32:57 crc kubenswrapper[4816]: I0316 00:32:57.470019 4816 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/7c03433b-4b96-4172-b344-de3e72b52900-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 16 00:32:57 crc kubenswrapper[4816]: I0316 00:32:57.470735 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7c03433b-4b96-4172-b344-de3e72b52900-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "7c03433b-4b96-4172-b344-de3e72b52900" (UID: "7c03433b-4b96-4172-b344-de3e72b52900"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:32:57 crc kubenswrapper[4816]: I0316 00:32:57.471116 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7c03433b-4b96-4172-b344-de3e72b52900-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "7c03433b-4b96-4172-b344-de3e72b52900" (UID: "7c03433b-4b96-4172-b344-de3e72b52900"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:32:57 crc kubenswrapper[4816]: I0316 00:32:57.471248 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7c03433b-4b96-4172-b344-de3e72b52900-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "7c03433b-4b96-4172-b344-de3e72b52900" (UID: "7c03433b-4b96-4172-b344-de3e72b52900"). InnerVolumeSpecName "build-system-configs". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:32:57 crc kubenswrapper[4816]: I0316 00:32:57.472739 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7c03433b-4b96-4172-b344-de3e72b52900-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "7c03433b-4b96-4172-b344-de3e72b52900" (UID: "7c03433b-4b96-4172-b344-de3e72b52900"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:32:57 crc kubenswrapper[4816]: I0316 00:32:57.473166 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7c03433b-4b96-4172-b344-de3e72b52900-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "7c03433b-4b96-4172-b344-de3e72b52900" (UID: "7c03433b-4b96-4172-b344-de3e72b52900"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:32:57 crc kubenswrapper[4816]: I0316 00:32:57.473448 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7c03433b-4b96-4172-b344-de3e72b52900-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "7c03433b-4b96-4172-b344-de3e72b52900" (UID: "7c03433b-4b96-4172-b344-de3e72b52900"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:32:57 crc kubenswrapper[4816]: I0316 00:32:57.473931 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7c03433b-4b96-4172-b344-de3e72b52900-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "7c03433b-4b96-4172-b344-de3e72b52900" (UID: "7c03433b-4b96-4172-b344-de3e72b52900"). InnerVolumeSpecName "container-storage-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:32:57 crc kubenswrapper[4816]: I0316 00:32:57.477960 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c03433b-4b96-4172-b344-de3e72b52900-kube-api-access-dfr8f" (OuterVolumeSpecName: "kube-api-access-dfr8f") pod "7c03433b-4b96-4172-b344-de3e72b52900" (UID: "7c03433b-4b96-4172-b344-de3e72b52900"). InnerVolumeSpecName "kube-api-access-dfr8f". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:32:57 crc kubenswrapper[4816]: I0316 00:32:57.477938 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c03433b-4b96-4172-b344-de3e72b52900-builder-dockercfg-fs5z5-pull" (OuterVolumeSpecName: "builder-dockercfg-fs5z5-pull") pod "7c03433b-4b96-4172-b344-de3e72b52900" (UID: "7c03433b-4b96-4172-b344-de3e72b52900"). InnerVolumeSpecName "builder-dockercfg-fs5z5-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:32:57 crc kubenswrapper[4816]: I0316 00:32:57.478226 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c03433b-4b96-4172-b344-de3e72b52900-builder-dockercfg-fs5z5-push" (OuterVolumeSpecName: "builder-dockercfg-fs5z5-push") pod "7c03433b-4b96-4172-b344-de3e72b52900" (UID: "7c03433b-4b96-4172-b344-de3e72b52900"). InnerVolumeSpecName "builder-dockercfg-fs5z5-push". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:32:57 crc kubenswrapper[4816]: I0316 00:32:57.571428 4816 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/7c03433b-4b96-4172-b344-de3e72b52900-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 16 00:32:57 crc kubenswrapper[4816]: I0316 00:32:57.571471 4816 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/7c03433b-4b96-4172-b344-de3e72b52900-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 16 00:32:57 crc kubenswrapper[4816]: I0316 00:32:57.571486 4816 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-fs5z5-pull\" (UniqueName: \"kubernetes.io/secret/7c03433b-4b96-4172-b344-de3e72b52900-builder-dockercfg-fs5z5-pull\") on node \"crc\" DevicePath \"\"" Mar 16 00:32:57 crc kubenswrapper[4816]: I0316 00:32:57.571502 4816 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/7c03433b-4b96-4172-b344-de3e72b52900-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 16 00:32:57 crc kubenswrapper[4816]: I0316 00:32:57.571514 4816 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7c03433b-4b96-4172-b344-de3e72b52900-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 16 00:32:57 crc kubenswrapper[4816]: I0316 00:32:57.571527 4816 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7c03433b-4b96-4172-b344-de3e72b52900-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 16 00:32:57 crc kubenswrapper[4816]: I0316 00:32:57.571539 4816 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/7c03433b-4b96-4172-b344-de3e72b52900-container-storage-root\") on node \"crc\" DevicePath 
\"\"" Mar 16 00:32:57 crc kubenswrapper[4816]: I0316 00:32:57.571573 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dfr8f\" (UniqueName: \"kubernetes.io/projected/7c03433b-4b96-4172-b344-de3e72b52900-kube-api-access-dfr8f\") on node \"crc\" DevicePath \"\"" Mar 16 00:32:57 crc kubenswrapper[4816]: I0316 00:32:57.571587 4816 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/7c03433b-4b96-4172-b344-de3e72b52900-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 16 00:32:57 crc kubenswrapper[4816]: I0316 00:32:57.571599 4816 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-fs5z5-push\" (UniqueName: \"kubernetes.io/secret/7c03433b-4b96-4172-b344-de3e72b52900-builder-dockercfg-fs5z5-push\") on node \"crc\" DevicePath \"\"" Mar 16 00:32:58 crc kubenswrapper[4816]: I0316 00:32:58.140807 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-bundle-1-build_7c03433b-4b96-4172-b344-de3e72b52900/docker-build/0.log" Mar 16 00:32:58 crc kubenswrapper[4816]: I0316 00:32:58.141132 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-1-build" event={"ID":"7c03433b-4b96-4172-b344-de3e72b52900","Type":"ContainerDied","Data":"68913983495e9d0f1ddcf3da8b311626b33a5d1edfe1be51675b53ba41003d13"} Mar 16 00:32:58 crc kubenswrapper[4816]: I0316 00:32:58.141165 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="68913983495e9d0f1ddcf3da8b311626b33a5d1edfe1be51675b53ba41003d13" Mar 16 00:32:58 crc kubenswrapper[4816]: I0316 00:32:58.141201 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 16 00:33:01 crc kubenswrapper[4816]: I0316 00:33:01.863125 4816 patch_prober.go:28] interesting pod/machine-config-daemon-jrdcz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 16 00:33:01 crc kubenswrapper[4816]: I0316 00:33:01.863472 4816 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" podUID="dd08ece2-7636-4966-973a-e96a34b70b53" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 16 00:33:01 crc kubenswrapper[4816]: I0316 00:33:01.863521 4816 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" Mar 16 00:33:01 crc kubenswrapper[4816]: I0316 00:33:01.864288 4816 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"92fd160da980a35a692640a98195800839c4f80b2447586e89c2230217ad0071"} pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 16 00:33:01 crc kubenswrapper[4816]: I0316 00:33:01.864369 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" podUID="dd08ece2-7636-4966-973a-e96a34b70b53" containerName="machine-config-daemon" containerID="cri-o://92fd160da980a35a692640a98195800839c4f80b2447586e89c2230217ad0071" gracePeriod=600 Mar 16 00:33:02 crc kubenswrapper[4816]: I0316 00:33:02.167396 4816 generic.go:334] "Generic (PLEG): container finished" 
podID="dd08ece2-7636-4966-973a-e96a34b70b53" containerID="92fd160da980a35a692640a98195800839c4f80b2447586e89c2230217ad0071" exitCode=0 Mar 16 00:33:02 crc kubenswrapper[4816]: I0316 00:33:02.167474 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" event={"ID":"dd08ece2-7636-4966-973a-e96a34b70b53","Type":"ContainerDied","Data":"92fd160da980a35a692640a98195800839c4f80b2447586e89c2230217ad0071"} Mar 16 00:33:02 crc kubenswrapper[4816]: I0316 00:33:02.167737 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" event={"ID":"dd08ece2-7636-4966-973a-e96a34b70b53","Type":"ContainerStarted","Data":"ad2d141490ffe2b1f91c3a2296efeb65e936382d1300bb9398efe83e779ccd8a"} Mar 16 00:33:02 crc kubenswrapper[4816]: I0316 00:33:02.167757 4816 scope.go:117] "RemoveContainer" containerID="4dae7771bcc5c45d3db6bc1014246492c003743ca85668bad7e04528051cc6bc" Mar 16 00:33:04 crc kubenswrapper[4816]: I0316 00:33:04.442570 4816 scope.go:117] "RemoveContainer" containerID="0b8b2a24c4f32aff091a974cc84de6242e724aacb4bfa1cc19578627d86a25d5" Mar 16 00:33:04 crc kubenswrapper[4816]: I0316 00:33:04.724595 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/smart-gateway-operator-bundle-1-build"] Mar 16 00:33:04 crc kubenswrapper[4816]: I0316 00:33:04.732641 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/smart-gateway-operator-bundle-1-build"] Mar 16 00:33:05 crc kubenswrapper[4816]: I0316 00:33:05.676537 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c03433b-4b96-4172-b344-de3e72b52900" path="/var/lib/kubelet/pods/7c03433b-4b96-4172-b344-de3e72b52900/volumes" Mar 16 00:33:06 crc kubenswrapper[4816]: I0316 00:33:06.708831 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/smart-gateway-operator-bundle-2-build"] Mar 16 00:33:06 crc kubenswrapper[4816]: 
E0316 00:33:06.709137 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c03433b-4b96-4172-b344-de3e72b52900" containerName="docker-build" Mar 16 00:33:06 crc kubenswrapper[4816]: I0316 00:33:06.709152 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c03433b-4b96-4172-b344-de3e72b52900" containerName="docker-build" Mar 16 00:33:06 crc kubenswrapper[4816]: E0316 00:33:06.709169 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c03433b-4b96-4172-b344-de3e72b52900" containerName="manage-dockerfile" Mar 16 00:33:06 crc kubenswrapper[4816]: I0316 00:33:06.709177 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c03433b-4b96-4172-b344-de3e72b52900" containerName="manage-dockerfile" Mar 16 00:33:06 crc kubenswrapper[4816]: I0316 00:33:06.709332 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c03433b-4b96-4172-b344-de3e72b52900" containerName="docker-build" Mar 16 00:33:06 crc kubenswrapper[4816]: I0316 00:33:06.710447 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 16 00:33:06 crc kubenswrapper[4816]: I0316 00:33:06.716327 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-bundle-2-sys-config" Mar 16 00:33:06 crc kubenswrapper[4816]: I0316 00:33:06.716788 4816 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-fs5z5" Mar 16 00:33:06 crc kubenswrapper[4816]: I0316 00:33:06.716949 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-bundle-2-ca" Mar 16 00:33:06 crc kubenswrapper[4816]: I0316 00:33:06.717074 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-bundle-2-global-ca" Mar 16 00:33:06 crc kubenswrapper[4816]: I0316 00:33:06.724449 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-bundle-2-build"] Mar 16 00:33:06 crc kubenswrapper[4816]: I0316 00:33:06.907508 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/4ac9b18d-8362-488c-a816-c85899c4aa6e-node-pullsecrets\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"4ac9b18d-8362-488c-a816-c85899c4aa6e\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 16 00:33:06 crc kubenswrapper[4816]: I0316 00:33:06.907745 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4ac9b18d-8362-488c-a816-c85899c4aa6e-build-ca-bundles\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"4ac9b18d-8362-488c-a816-c85899c4aa6e\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 16 00:33:06 crc kubenswrapper[4816]: I0316 00:33:06.907860 4816 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/4ac9b18d-8362-488c-a816-c85899c4aa6e-build-blob-cache\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"4ac9b18d-8362-488c-a816-c85899c4aa6e\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 16 00:33:06 crc kubenswrapper[4816]: I0316 00:33:06.907934 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-fs5z5-pull\" (UniqueName: \"kubernetes.io/secret/4ac9b18d-8362-488c-a816-c85899c4aa6e-builder-dockercfg-fs5z5-pull\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"4ac9b18d-8362-488c-a816-c85899c4aa6e\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 16 00:33:06 crc kubenswrapper[4816]: I0316 00:33:06.908023 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-fs5z5-push\" (UniqueName: \"kubernetes.io/secret/4ac9b18d-8362-488c-a816-c85899c4aa6e-builder-dockercfg-fs5z5-push\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"4ac9b18d-8362-488c-a816-c85899c4aa6e\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 16 00:33:06 crc kubenswrapper[4816]: I0316 00:33:06.908124 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/4ac9b18d-8362-488c-a816-c85899c4aa6e-buildworkdir\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"4ac9b18d-8362-488c-a816-c85899c4aa6e\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 16 00:33:06 crc kubenswrapper[4816]: I0316 00:33:06.908241 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/4ac9b18d-8362-488c-a816-c85899c4aa6e-build-system-configs\") pod 
\"smart-gateway-operator-bundle-2-build\" (UID: \"4ac9b18d-8362-488c-a816-c85899c4aa6e\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 16 00:33:06 crc kubenswrapper[4816]: I0316 00:33:06.908377 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/4ac9b18d-8362-488c-a816-c85899c4aa6e-buildcachedir\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"4ac9b18d-8362-488c-a816-c85899c4aa6e\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 16 00:33:06 crc kubenswrapper[4816]: I0316 00:33:06.908469 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/4ac9b18d-8362-488c-a816-c85899c4aa6e-container-storage-root\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"4ac9b18d-8362-488c-a816-c85899c4aa6e\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 16 00:33:06 crc kubenswrapper[4816]: I0316 00:33:06.908576 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/4ac9b18d-8362-488c-a816-c85899c4aa6e-container-storage-run\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"4ac9b18d-8362-488c-a816-c85899c4aa6e\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 16 00:33:06 crc kubenswrapper[4816]: I0316 00:33:06.908981 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjlvd\" (UniqueName: \"kubernetes.io/projected/4ac9b18d-8362-488c-a816-c85899c4aa6e-kube-api-access-xjlvd\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"4ac9b18d-8362-488c-a816-c85899c4aa6e\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 16 00:33:06 crc kubenswrapper[4816]: I0316 00:33:06.909159 4816 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4ac9b18d-8362-488c-a816-c85899c4aa6e-build-proxy-ca-bundles\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"4ac9b18d-8362-488c-a816-c85899c4aa6e\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 16 00:33:07 crc kubenswrapper[4816]: I0316 00:33:07.010926 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/4ac9b18d-8362-488c-a816-c85899c4aa6e-node-pullsecrets\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"4ac9b18d-8362-488c-a816-c85899c4aa6e\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 16 00:33:07 crc kubenswrapper[4816]: I0316 00:33:07.010978 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4ac9b18d-8362-488c-a816-c85899c4aa6e-build-ca-bundles\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"4ac9b18d-8362-488c-a816-c85899c4aa6e\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 16 00:33:07 crc kubenswrapper[4816]: I0316 00:33:07.010996 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/4ac9b18d-8362-488c-a816-c85899c4aa6e-build-blob-cache\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"4ac9b18d-8362-488c-a816-c85899c4aa6e\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 16 00:33:07 crc kubenswrapper[4816]: I0316 00:33:07.011013 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-fs5z5-pull\" (UniqueName: \"kubernetes.io/secret/4ac9b18d-8362-488c-a816-c85899c4aa6e-builder-dockercfg-fs5z5-pull\") pod \"smart-gateway-operator-bundle-2-build\" (UID: 
\"4ac9b18d-8362-488c-a816-c85899c4aa6e\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 16 00:33:07 crc kubenswrapper[4816]: I0316 00:33:07.011041 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-fs5z5-push\" (UniqueName: \"kubernetes.io/secret/4ac9b18d-8362-488c-a816-c85899c4aa6e-builder-dockercfg-fs5z5-push\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"4ac9b18d-8362-488c-a816-c85899c4aa6e\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 16 00:33:07 crc kubenswrapper[4816]: I0316 00:33:07.011067 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/4ac9b18d-8362-488c-a816-c85899c4aa6e-buildworkdir\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"4ac9b18d-8362-488c-a816-c85899c4aa6e\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 16 00:33:07 crc kubenswrapper[4816]: I0316 00:33:07.011095 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/4ac9b18d-8362-488c-a816-c85899c4aa6e-build-system-configs\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"4ac9b18d-8362-488c-a816-c85899c4aa6e\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 16 00:33:07 crc kubenswrapper[4816]: I0316 00:33:07.011113 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/4ac9b18d-8362-488c-a816-c85899c4aa6e-buildcachedir\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"4ac9b18d-8362-488c-a816-c85899c4aa6e\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 16 00:33:07 crc kubenswrapper[4816]: I0316 00:33:07.011128 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: 
\"kubernetes.io/empty-dir/4ac9b18d-8362-488c-a816-c85899c4aa6e-container-storage-root\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"4ac9b18d-8362-488c-a816-c85899c4aa6e\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 16 00:33:07 crc kubenswrapper[4816]: I0316 00:33:07.011142 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/4ac9b18d-8362-488c-a816-c85899c4aa6e-container-storage-run\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"4ac9b18d-8362-488c-a816-c85899c4aa6e\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 16 00:33:07 crc kubenswrapper[4816]: I0316 00:33:07.011170 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xjlvd\" (UniqueName: \"kubernetes.io/projected/4ac9b18d-8362-488c-a816-c85899c4aa6e-kube-api-access-xjlvd\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"4ac9b18d-8362-488c-a816-c85899c4aa6e\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 16 00:33:07 crc kubenswrapper[4816]: I0316 00:33:07.011214 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4ac9b18d-8362-488c-a816-c85899c4aa6e-build-proxy-ca-bundles\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"4ac9b18d-8362-488c-a816-c85899c4aa6e\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 16 00:33:07 crc kubenswrapper[4816]: I0316 00:33:07.011220 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/4ac9b18d-8362-488c-a816-c85899c4aa6e-node-pullsecrets\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"4ac9b18d-8362-488c-a816-c85899c4aa6e\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 16 00:33:07 crc kubenswrapper[4816]: I0316 
00:33:07.011667 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/4ac9b18d-8362-488c-a816-c85899c4aa6e-buildworkdir\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"4ac9b18d-8362-488c-a816-c85899c4aa6e\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 16 00:33:07 crc kubenswrapper[4816]: I0316 00:33:07.011679 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/4ac9b18d-8362-488c-a816-c85899c4aa6e-build-blob-cache\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"4ac9b18d-8362-488c-a816-c85899c4aa6e\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 16 00:33:07 crc kubenswrapper[4816]: I0316 00:33:07.011928 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/4ac9b18d-8362-488c-a816-c85899c4aa6e-container-storage-root\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"4ac9b18d-8362-488c-a816-c85899c4aa6e\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 16 00:33:07 crc kubenswrapper[4816]: I0316 00:33:07.011988 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/4ac9b18d-8362-488c-a816-c85899c4aa6e-buildcachedir\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"4ac9b18d-8362-488c-a816-c85899c4aa6e\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 16 00:33:07 crc kubenswrapper[4816]: I0316 00:33:07.012103 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4ac9b18d-8362-488c-a816-c85899c4aa6e-build-proxy-ca-bundles\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"4ac9b18d-8362-488c-a816-c85899c4aa6e\") " 
pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 16 00:33:07 crc kubenswrapper[4816]: I0316 00:33:07.012107 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/4ac9b18d-8362-488c-a816-c85899c4aa6e-build-system-configs\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"4ac9b18d-8362-488c-a816-c85899c4aa6e\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 16 00:33:07 crc kubenswrapper[4816]: I0316 00:33:07.012193 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/4ac9b18d-8362-488c-a816-c85899c4aa6e-container-storage-run\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"4ac9b18d-8362-488c-a816-c85899c4aa6e\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 16 00:33:07 crc kubenswrapper[4816]: I0316 00:33:07.012341 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4ac9b18d-8362-488c-a816-c85899c4aa6e-build-ca-bundles\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"4ac9b18d-8362-488c-a816-c85899c4aa6e\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 16 00:33:07 crc kubenswrapper[4816]: I0316 00:33:07.017056 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-fs5z5-push\" (UniqueName: \"kubernetes.io/secret/4ac9b18d-8362-488c-a816-c85899c4aa6e-builder-dockercfg-fs5z5-push\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"4ac9b18d-8362-488c-a816-c85899c4aa6e\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 16 00:33:07 crc kubenswrapper[4816]: I0316 00:33:07.017326 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-fs5z5-pull\" (UniqueName: 
\"kubernetes.io/secret/4ac9b18d-8362-488c-a816-c85899c4aa6e-builder-dockercfg-fs5z5-pull\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"4ac9b18d-8362-488c-a816-c85899c4aa6e\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 16 00:33:07 crc kubenswrapper[4816]: I0316 00:33:07.029214 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xjlvd\" (UniqueName: \"kubernetes.io/projected/4ac9b18d-8362-488c-a816-c85899c4aa6e-kube-api-access-xjlvd\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"4ac9b18d-8362-488c-a816-c85899c4aa6e\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 16 00:33:07 crc kubenswrapper[4816]: I0316 00:33:07.071967 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 16 00:33:07 crc kubenswrapper[4816]: I0316 00:33:07.274793 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-bundle-2-build"] Mar 16 00:33:08 crc kubenswrapper[4816]: I0316 00:33:08.215121 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-2-build" event={"ID":"4ac9b18d-8362-488c-a816-c85899c4aa6e","Type":"ContainerStarted","Data":"72a8749a1dbd2bdcde0b4e49f8bc79349f100d9e298284d8ed3050dbf4e9a676"} Mar 16 00:33:08 crc kubenswrapper[4816]: I0316 00:33:08.215181 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-2-build" event={"ID":"4ac9b18d-8362-488c-a816-c85899c4aa6e","Type":"ContainerStarted","Data":"b6c00ab1c06d051f1145bfff23f81747ff821693325bc2d3f63912c28c948e1c"} Mar 16 00:33:09 crc kubenswrapper[4816]: I0316 00:33:09.223907 4816 generic.go:334] "Generic (PLEG): container finished" podID="4ac9b18d-8362-488c-a816-c85899c4aa6e" containerID="72a8749a1dbd2bdcde0b4e49f8bc79349f100d9e298284d8ed3050dbf4e9a676" exitCode=0 Mar 16 00:33:09 crc 
kubenswrapper[4816]: I0316 00:33:09.224003 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-2-build" event={"ID":"4ac9b18d-8362-488c-a816-c85899c4aa6e","Type":"ContainerDied","Data":"72a8749a1dbd2bdcde0b4e49f8bc79349f100d9e298284d8ed3050dbf4e9a676"} Mar 16 00:33:10 crc kubenswrapper[4816]: I0316 00:33:10.232506 4816 generic.go:334] "Generic (PLEG): container finished" podID="4ac9b18d-8362-488c-a816-c85899c4aa6e" containerID="6b82c27233585220fc0a8e3b4927009ca2ab8e3e309563a177d52d0653f2ae10" exitCode=0 Mar 16 00:33:10 crc kubenswrapper[4816]: I0316 00:33:10.232596 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-2-build" event={"ID":"4ac9b18d-8362-488c-a816-c85899c4aa6e","Type":"ContainerDied","Data":"6b82c27233585220fc0a8e3b4927009ca2ab8e3e309563a177d52d0653f2ae10"} Mar 16 00:33:10 crc kubenswrapper[4816]: I0316 00:33:10.288888 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-bundle-2-build_4ac9b18d-8362-488c-a816-c85899c4aa6e/manage-dockerfile/0.log" Mar 16 00:33:11 crc kubenswrapper[4816]: I0316 00:33:11.240621 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-2-build" event={"ID":"4ac9b18d-8362-488c-a816-c85899c4aa6e","Type":"ContainerStarted","Data":"514ea21f1c2bd0b5e2d6897180f4ed6f68308cbb5552dadf3beb9f36f0ab1f92"} Mar 16 00:33:11 crc kubenswrapper[4816]: I0316 00:33:11.267982 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/smart-gateway-operator-bundle-2-build" podStartSLOduration=5.267966398 podStartE2EDuration="5.267966398s" podCreationTimestamp="2026-03-16 00:33:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:33:11.265622613 +0000 UTC m=+1584.361922566" 
watchObservedRunningTime="2026-03-16 00:33:11.267966398 +0000 UTC m=+1584.364266351" Mar 16 00:33:14 crc kubenswrapper[4816]: I0316 00:33:14.257907 4816 generic.go:334] "Generic (PLEG): container finished" podID="4ac9b18d-8362-488c-a816-c85899c4aa6e" containerID="514ea21f1c2bd0b5e2d6897180f4ed6f68308cbb5552dadf3beb9f36f0ab1f92" exitCode=0 Mar 16 00:33:14 crc kubenswrapper[4816]: I0316 00:33:14.258016 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-2-build" event={"ID":"4ac9b18d-8362-488c-a816-c85899c4aa6e","Type":"ContainerDied","Data":"514ea21f1c2bd0b5e2d6897180f4ed6f68308cbb5552dadf3beb9f36f0ab1f92"} Mar 16 00:33:15 crc kubenswrapper[4816]: I0316 00:33:15.557938 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 16 00:33:15 crc kubenswrapper[4816]: I0316 00:33:15.717867 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/4ac9b18d-8362-488c-a816-c85899c4aa6e-container-storage-run\") pod \"4ac9b18d-8362-488c-a816-c85899c4aa6e\" (UID: \"4ac9b18d-8362-488c-a816-c85899c4aa6e\") " Mar 16 00:33:15 crc kubenswrapper[4816]: I0316 00:33:15.717966 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-fs5z5-push\" (UniqueName: \"kubernetes.io/secret/4ac9b18d-8362-488c-a816-c85899c4aa6e-builder-dockercfg-fs5z5-push\") pod \"4ac9b18d-8362-488c-a816-c85899c4aa6e\" (UID: \"4ac9b18d-8362-488c-a816-c85899c4aa6e\") " Mar 16 00:33:15 crc kubenswrapper[4816]: I0316 00:33:15.718048 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4ac9b18d-8362-488c-a816-c85899c4aa6e-build-ca-bundles\") pod \"4ac9b18d-8362-488c-a816-c85899c4aa6e\" (UID: \"4ac9b18d-8362-488c-a816-c85899c4aa6e\") " Mar 16 00:33:15 
crc kubenswrapper[4816]: I0316 00:33:15.718075 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/4ac9b18d-8362-488c-a816-c85899c4aa6e-buildworkdir\") pod \"4ac9b18d-8362-488c-a816-c85899c4aa6e\" (UID: \"4ac9b18d-8362-488c-a816-c85899c4aa6e\") " Mar 16 00:33:15 crc kubenswrapper[4816]: I0316 00:33:15.718096 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/4ac9b18d-8362-488c-a816-c85899c4aa6e-build-blob-cache\") pod \"4ac9b18d-8362-488c-a816-c85899c4aa6e\" (UID: \"4ac9b18d-8362-488c-a816-c85899c4aa6e\") " Mar 16 00:33:15 crc kubenswrapper[4816]: I0316 00:33:15.718124 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/4ac9b18d-8362-488c-a816-c85899c4aa6e-container-storage-root\") pod \"4ac9b18d-8362-488c-a816-c85899c4aa6e\" (UID: \"4ac9b18d-8362-488c-a816-c85899c4aa6e\") " Mar 16 00:33:15 crc kubenswrapper[4816]: I0316 00:33:15.718155 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xjlvd\" (UniqueName: \"kubernetes.io/projected/4ac9b18d-8362-488c-a816-c85899c4aa6e-kube-api-access-xjlvd\") pod \"4ac9b18d-8362-488c-a816-c85899c4aa6e\" (UID: \"4ac9b18d-8362-488c-a816-c85899c4aa6e\") " Mar 16 00:33:15 crc kubenswrapper[4816]: I0316 00:33:15.718181 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4ac9b18d-8362-488c-a816-c85899c4aa6e-build-proxy-ca-bundles\") pod \"4ac9b18d-8362-488c-a816-c85899c4aa6e\" (UID: \"4ac9b18d-8362-488c-a816-c85899c4aa6e\") " Mar 16 00:33:15 crc kubenswrapper[4816]: I0316 00:33:15.718211 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: 
\"kubernetes.io/configmap/4ac9b18d-8362-488c-a816-c85899c4aa6e-build-system-configs\") pod \"4ac9b18d-8362-488c-a816-c85899c4aa6e\" (UID: \"4ac9b18d-8362-488c-a816-c85899c4aa6e\") " Mar 16 00:33:15 crc kubenswrapper[4816]: I0316 00:33:15.718245 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-fs5z5-pull\" (UniqueName: \"kubernetes.io/secret/4ac9b18d-8362-488c-a816-c85899c4aa6e-builder-dockercfg-fs5z5-pull\") pod \"4ac9b18d-8362-488c-a816-c85899c4aa6e\" (UID: \"4ac9b18d-8362-488c-a816-c85899c4aa6e\") " Mar 16 00:33:15 crc kubenswrapper[4816]: I0316 00:33:15.718270 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/4ac9b18d-8362-488c-a816-c85899c4aa6e-node-pullsecrets\") pod \"4ac9b18d-8362-488c-a816-c85899c4aa6e\" (UID: \"4ac9b18d-8362-488c-a816-c85899c4aa6e\") " Mar 16 00:33:15 crc kubenswrapper[4816]: I0316 00:33:15.718289 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/4ac9b18d-8362-488c-a816-c85899c4aa6e-buildcachedir\") pod \"4ac9b18d-8362-488c-a816-c85899c4aa6e\" (UID: \"4ac9b18d-8362-488c-a816-c85899c4aa6e\") " Mar 16 00:33:15 crc kubenswrapper[4816]: I0316 00:33:15.718999 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ac9b18d-8362-488c-a816-c85899c4aa6e-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "4ac9b18d-8362-488c-a816-c85899c4aa6e" (UID: "4ac9b18d-8362-488c-a816-c85899c4aa6e"). InnerVolumeSpecName "build-system-configs". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:33:15 crc kubenswrapper[4816]: I0316 00:33:15.719071 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ac9b18d-8362-488c-a816-c85899c4aa6e-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "4ac9b18d-8362-488c-a816-c85899c4aa6e" (UID: "4ac9b18d-8362-488c-a816-c85899c4aa6e"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:33:15 crc kubenswrapper[4816]: I0316 00:33:15.719395 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4ac9b18d-8362-488c-a816-c85899c4aa6e-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "4ac9b18d-8362-488c-a816-c85899c4aa6e" (UID: "4ac9b18d-8362-488c-a816-c85899c4aa6e"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:33:15 crc kubenswrapper[4816]: I0316 00:33:15.719634 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ac9b18d-8362-488c-a816-c85899c4aa6e-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "4ac9b18d-8362-488c-a816-c85899c4aa6e" (UID: "4ac9b18d-8362-488c-a816-c85899c4aa6e"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:33:15 crc kubenswrapper[4816]: I0316 00:33:15.718681 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4ac9b18d-8362-488c-a816-c85899c4aa6e-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "4ac9b18d-8362-488c-a816-c85899c4aa6e" (UID: "4ac9b18d-8362-488c-a816-c85899c4aa6e"). InnerVolumeSpecName "node-pullsecrets". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:33:15 crc kubenswrapper[4816]: I0316 00:33:15.720038 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ac9b18d-8362-488c-a816-c85899c4aa6e-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "4ac9b18d-8362-488c-a816-c85899c4aa6e" (UID: "4ac9b18d-8362-488c-a816-c85899c4aa6e"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:33:15 crc kubenswrapper[4816]: I0316 00:33:15.719704 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ac9b18d-8362-488c-a816-c85899c4aa6e-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "4ac9b18d-8362-488c-a816-c85899c4aa6e" (UID: "4ac9b18d-8362-488c-a816-c85899c4aa6e"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:33:15 crc kubenswrapper[4816]: I0316 00:33:15.721590 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ac9b18d-8362-488c-a816-c85899c4aa6e-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "4ac9b18d-8362-488c-a816-c85899c4aa6e" (UID: "4ac9b18d-8362-488c-a816-c85899c4aa6e"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:33:15 crc kubenswrapper[4816]: I0316 00:33:15.724896 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ac9b18d-8362-488c-a816-c85899c4aa6e-builder-dockercfg-fs5z5-push" (OuterVolumeSpecName: "builder-dockercfg-fs5z5-push") pod "4ac9b18d-8362-488c-a816-c85899c4aa6e" (UID: "4ac9b18d-8362-488c-a816-c85899c4aa6e"). InnerVolumeSpecName "builder-dockercfg-fs5z5-push". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:33:15 crc kubenswrapper[4816]: I0316 00:33:15.725486 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ac9b18d-8362-488c-a816-c85899c4aa6e-kube-api-access-xjlvd" (OuterVolumeSpecName: "kube-api-access-xjlvd") pod "4ac9b18d-8362-488c-a816-c85899c4aa6e" (UID: "4ac9b18d-8362-488c-a816-c85899c4aa6e"). InnerVolumeSpecName "kube-api-access-xjlvd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:33:15 crc kubenswrapper[4816]: I0316 00:33:15.726161 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ac9b18d-8362-488c-a816-c85899c4aa6e-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "4ac9b18d-8362-488c-a816-c85899c4aa6e" (UID: "4ac9b18d-8362-488c-a816-c85899c4aa6e"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:33:15 crc kubenswrapper[4816]: I0316 00:33:15.729741 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ac9b18d-8362-488c-a816-c85899c4aa6e-builder-dockercfg-fs5z5-pull" (OuterVolumeSpecName: "builder-dockercfg-fs5z5-pull") pod "4ac9b18d-8362-488c-a816-c85899c4aa6e" (UID: "4ac9b18d-8362-488c-a816-c85899c4aa6e"). InnerVolumeSpecName "builder-dockercfg-fs5z5-pull". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:33:15 crc kubenswrapper[4816]: I0316 00:33:15.820073 4816 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/4ac9b18d-8362-488c-a816-c85899c4aa6e-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 16 00:33:15 crc kubenswrapper[4816]: I0316 00:33:15.820123 4816 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-fs5z5-pull\" (UniqueName: \"kubernetes.io/secret/4ac9b18d-8362-488c-a816-c85899c4aa6e-builder-dockercfg-fs5z5-pull\") on node \"crc\" DevicePath \"\"" Mar 16 00:33:15 crc kubenswrapper[4816]: I0316 00:33:15.820137 4816 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/4ac9b18d-8362-488c-a816-c85899c4aa6e-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 16 00:33:15 crc kubenswrapper[4816]: I0316 00:33:15.820148 4816 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/4ac9b18d-8362-488c-a816-c85899c4aa6e-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 16 00:33:15 crc kubenswrapper[4816]: I0316 00:33:15.820159 4816 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/4ac9b18d-8362-488c-a816-c85899c4aa6e-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 16 00:33:15 crc kubenswrapper[4816]: I0316 00:33:15.820169 4816 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-fs5z5-push\" (UniqueName: \"kubernetes.io/secret/4ac9b18d-8362-488c-a816-c85899c4aa6e-builder-dockercfg-fs5z5-push\") on node \"crc\" DevicePath \"\"" Mar 16 00:33:15 crc kubenswrapper[4816]: I0316 00:33:15.820181 4816 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4ac9b18d-8362-488c-a816-c85899c4aa6e-build-ca-bundles\") on node \"crc\" 
DevicePath \"\"" Mar 16 00:33:15 crc kubenswrapper[4816]: I0316 00:33:15.820190 4816 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/4ac9b18d-8362-488c-a816-c85899c4aa6e-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 16 00:33:15 crc kubenswrapper[4816]: I0316 00:33:15.820201 4816 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/4ac9b18d-8362-488c-a816-c85899c4aa6e-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 16 00:33:15 crc kubenswrapper[4816]: I0316 00:33:15.820210 4816 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/4ac9b18d-8362-488c-a816-c85899c4aa6e-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 16 00:33:15 crc kubenswrapper[4816]: I0316 00:33:15.820222 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xjlvd\" (UniqueName: \"kubernetes.io/projected/4ac9b18d-8362-488c-a816-c85899c4aa6e-kube-api-access-xjlvd\") on node \"crc\" DevicePath \"\"" Mar 16 00:33:15 crc kubenswrapper[4816]: I0316 00:33:15.820233 4816 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4ac9b18d-8362-488c-a816-c85899c4aa6e-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 16 00:33:16 crc kubenswrapper[4816]: I0316 00:33:16.273573 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-2-build" event={"ID":"4ac9b18d-8362-488c-a816-c85899c4aa6e","Type":"ContainerDied","Data":"b6c00ab1c06d051f1145bfff23f81747ff821693325bc2d3f63912c28c948e1c"} Mar 16 00:33:16 crc kubenswrapper[4816]: I0316 00:33:16.273611 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b6c00ab1c06d051f1145bfff23f81747ff821693325bc2d3f63912c28c948e1c" Mar 16 00:33:16 crc kubenswrapper[4816]: I0316 
00:33:16.273657 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 16 00:33:32 crc kubenswrapper[4816]: I0316 00:33:32.200291 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/service-telemetry-framework-index-1-build"] Mar 16 00:33:32 crc kubenswrapper[4816]: E0316 00:33:32.200984 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ac9b18d-8362-488c-a816-c85899c4aa6e" containerName="git-clone" Mar 16 00:33:32 crc kubenswrapper[4816]: I0316 00:33:32.200998 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ac9b18d-8362-488c-a816-c85899c4aa6e" containerName="git-clone" Mar 16 00:33:32 crc kubenswrapper[4816]: E0316 00:33:32.201008 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ac9b18d-8362-488c-a816-c85899c4aa6e" containerName="docker-build" Mar 16 00:33:32 crc kubenswrapper[4816]: I0316 00:33:32.201014 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ac9b18d-8362-488c-a816-c85899c4aa6e" containerName="docker-build" Mar 16 00:33:32 crc kubenswrapper[4816]: E0316 00:33:32.201024 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ac9b18d-8362-488c-a816-c85899c4aa6e" containerName="manage-dockerfile" Mar 16 00:33:32 crc kubenswrapper[4816]: I0316 00:33:32.201030 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ac9b18d-8362-488c-a816-c85899c4aa6e" containerName="manage-dockerfile" Mar 16 00:33:32 crc kubenswrapper[4816]: I0316 00:33:32.201145 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ac9b18d-8362-488c-a816-c85899c4aa6e" containerName="docker-build" Mar 16 00:33:32 crc kubenswrapper[4816]: I0316 00:33:32.201919 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 16 00:33:32 crc kubenswrapper[4816]: I0316 00:33:32.205310 4816 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"service-telemetry-framework-index-dockercfg" Mar 16 00:33:32 crc kubenswrapper[4816]: I0316 00:33:32.206589 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-framework-index-1-ca" Mar 16 00:33:32 crc kubenswrapper[4816]: I0316 00:33:32.206804 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-framework-index-1-global-ca" Mar 16 00:33:32 crc kubenswrapper[4816]: I0316 00:33:32.207030 4816 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-fs5z5" Mar 16 00:33:32 crc kubenswrapper[4816]: I0316 00:33:32.207153 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-framework-index-1-sys-config" Mar 16 00:33:32 crc kubenswrapper[4816]: I0316 00:33:32.227116 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-framework-index-1-build"] Mar 16 00:33:32 crc kubenswrapper[4816]: I0316 00:33:32.325834 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/ff71ddfd-c6da-40e7-ac26-e7178e364679-container-storage-root\") pod \"service-telemetry-framework-index-1-build\" (UID: \"ff71ddfd-c6da-40e7-ac26-e7178e364679\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 16 00:33:32 crc kubenswrapper[4816]: I0316 00:33:32.325992 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/ff71ddfd-c6da-40e7-ac26-e7178e364679-buildcachedir\") pod 
\"service-telemetry-framework-index-1-build\" (UID: \"ff71ddfd-c6da-40e7-ac26-e7178e364679\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 16 00:33:32 crc kubenswrapper[4816]: I0316 00:33:32.326031 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ff71ddfd-c6da-40e7-ac26-e7178e364679-build-proxy-ca-bundles\") pod \"service-telemetry-framework-index-1-build\" (UID: \"ff71ddfd-c6da-40e7-ac26-e7178e364679\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 16 00:33:32 crc kubenswrapper[4816]: I0316 00:33:32.326108 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/ff71ddfd-c6da-40e7-ac26-e7178e364679-build-system-configs\") pod \"service-telemetry-framework-index-1-build\" (UID: \"ff71ddfd-c6da-40e7-ac26-e7178e364679\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 16 00:33:32 crc kubenswrapper[4816]: I0316 00:33:32.326132 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-fs5z5-pull\" (UniqueName: \"kubernetes.io/secret/ff71ddfd-c6da-40e7-ac26-e7178e364679-builder-dockercfg-fs5z5-pull\") pod \"service-telemetry-framework-index-1-build\" (UID: \"ff71ddfd-c6da-40e7-ac26-e7178e364679\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 16 00:33:32 crc kubenswrapper[4816]: I0316 00:33:32.326153 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/ff71ddfd-c6da-40e7-ac26-e7178e364679-container-storage-run\") pod \"service-telemetry-framework-index-1-build\" (UID: \"ff71ddfd-c6da-40e7-ac26-e7178e364679\") " pod="service-telemetry/service-telemetry-framework-index-1-build" 
Mar 16 00:33:32 crc kubenswrapper[4816]: I0316 00:33:32.326211 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-fs5z5-push\" (UniqueName: \"kubernetes.io/secret/ff71ddfd-c6da-40e7-ac26-e7178e364679-builder-dockercfg-fs5z5-push\") pod \"service-telemetry-framework-index-1-build\" (UID: \"ff71ddfd-c6da-40e7-ac26-e7178e364679\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 16 00:33:32 crc kubenswrapper[4816]: I0316 00:33:32.326254 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/ff71ddfd-c6da-40e7-ac26-e7178e364679-build-blob-cache\") pod \"service-telemetry-framework-index-1-build\" (UID: \"ff71ddfd-c6da-40e7-ac26-e7178e364679\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 16 00:33:32 crc kubenswrapper[4816]: I0316 00:33:32.326281 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/ff71ddfd-c6da-40e7-ac26-e7178e364679-buildworkdir\") pod \"service-telemetry-framework-index-1-build\" (UID: \"ff71ddfd-c6da-40e7-ac26-e7178e364679\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 16 00:33:32 crc kubenswrapper[4816]: I0316 00:33:32.326314 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/ff71ddfd-c6da-40e7-ac26-e7178e364679-node-pullsecrets\") pod \"service-telemetry-framework-index-1-build\" (UID: \"ff71ddfd-c6da-40e7-ac26-e7178e364679\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 16 00:33:32 crc kubenswrapper[4816]: I0316 00:33:32.326377 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/ff71ddfd-c6da-40e7-ac26-e7178e364679-build-ca-bundles\") pod \"service-telemetry-framework-index-1-build\" (UID: \"ff71ddfd-c6da-40e7-ac26-e7178e364679\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 16 00:33:32 crc kubenswrapper[4816]: I0316 00:33:32.326446 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-telemetry-framework-index-dockercfg-user-build-volume\" (UniqueName: \"kubernetes.io/secret/ff71ddfd-c6da-40e7-ac26-e7178e364679-service-telemetry-framework-index-dockercfg-user-build-volume\") pod \"service-telemetry-framework-index-1-build\" (UID: \"ff71ddfd-c6da-40e7-ac26-e7178e364679\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 16 00:33:32 crc kubenswrapper[4816]: I0316 00:33:32.326504 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rhdb\" (UniqueName: \"kubernetes.io/projected/ff71ddfd-c6da-40e7-ac26-e7178e364679-kube-api-access-9rhdb\") pod \"service-telemetry-framework-index-1-build\" (UID: \"ff71ddfd-c6da-40e7-ac26-e7178e364679\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 16 00:33:32 crc kubenswrapper[4816]: I0316 00:33:32.427837 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/ff71ddfd-c6da-40e7-ac26-e7178e364679-buildcachedir\") pod \"service-telemetry-framework-index-1-build\" (UID: \"ff71ddfd-c6da-40e7-ac26-e7178e364679\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 16 00:33:32 crc kubenswrapper[4816]: I0316 00:33:32.427887 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ff71ddfd-c6da-40e7-ac26-e7178e364679-build-proxy-ca-bundles\") pod \"service-telemetry-framework-index-1-build\" (UID: 
\"ff71ddfd-c6da-40e7-ac26-e7178e364679\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 16 00:33:32 crc kubenswrapper[4816]: I0316 00:33:32.427928 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/ff71ddfd-c6da-40e7-ac26-e7178e364679-build-system-configs\") pod \"service-telemetry-framework-index-1-build\" (UID: \"ff71ddfd-c6da-40e7-ac26-e7178e364679\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 16 00:33:32 crc kubenswrapper[4816]: I0316 00:33:32.427961 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-fs5z5-pull\" (UniqueName: \"kubernetes.io/secret/ff71ddfd-c6da-40e7-ac26-e7178e364679-builder-dockercfg-fs5z5-pull\") pod \"service-telemetry-framework-index-1-build\" (UID: \"ff71ddfd-c6da-40e7-ac26-e7178e364679\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 16 00:33:32 crc kubenswrapper[4816]: I0316 00:33:32.427972 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/ff71ddfd-c6da-40e7-ac26-e7178e364679-buildcachedir\") pod \"service-telemetry-framework-index-1-build\" (UID: \"ff71ddfd-c6da-40e7-ac26-e7178e364679\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 16 00:33:32 crc kubenswrapper[4816]: I0316 00:33:32.428008 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/ff71ddfd-c6da-40e7-ac26-e7178e364679-container-storage-run\") pod \"service-telemetry-framework-index-1-build\" (UID: \"ff71ddfd-c6da-40e7-ac26-e7178e364679\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 16 00:33:32 crc kubenswrapper[4816]: I0316 00:33:32.428054 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"builder-dockercfg-fs5z5-push\" (UniqueName: \"kubernetes.io/secret/ff71ddfd-c6da-40e7-ac26-e7178e364679-builder-dockercfg-fs5z5-push\") pod \"service-telemetry-framework-index-1-build\" (UID: \"ff71ddfd-c6da-40e7-ac26-e7178e364679\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 16 00:33:32 crc kubenswrapper[4816]: I0316 00:33:32.428090 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/ff71ddfd-c6da-40e7-ac26-e7178e364679-build-blob-cache\") pod \"service-telemetry-framework-index-1-build\" (UID: \"ff71ddfd-c6da-40e7-ac26-e7178e364679\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 16 00:33:32 crc kubenswrapper[4816]: I0316 00:33:32.428126 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/ff71ddfd-c6da-40e7-ac26-e7178e364679-buildworkdir\") pod \"service-telemetry-framework-index-1-build\" (UID: \"ff71ddfd-c6da-40e7-ac26-e7178e364679\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 16 00:33:32 crc kubenswrapper[4816]: I0316 00:33:32.428167 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/ff71ddfd-c6da-40e7-ac26-e7178e364679-node-pullsecrets\") pod \"service-telemetry-framework-index-1-build\" (UID: \"ff71ddfd-c6da-40e7-ac26-e7178e364679\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 16 00:33:32 crc kubenswrapper[4816]: I0316 00:33:32.428196 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ff71ddfd-c6da-40e7-ac26-e7178e364679-build-ca-bundles\") pod \"service-telemetry-framework-index-1-build\" (UID: \"ff71ddfd-c6da-40e7-ac26-e7178e364679\") " 
pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 16 00:33:32 crc kubenswrapper[4816]: I0316 00:33:32.428255 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-telemetry-framework-index-dockercfg-user-build-volume\" (UniqueName: \"kubernetes.io/secret/ff71ddfd-c6da-40e7-ac26-e7178e364679-service-telemetry-framework-index-dockercfg-user-build-volume\") pod \"service-telemetry-framework-index-1-build\" (UID: \"ff71ddfd-c6da-40e7-ac26-e7178e364679\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 16 00:33:32 crc kubenswrapper[4816]: I0316 00:33:32.428306 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9rhdb\" (UniqueName: \"kubernetes.io/projected/ff71ddfd-c6da-40e7-ac26-e7178e364679-kube-api-access-9rhdb\") pod \"service-telemetry-framework-index-1-build\" (UID: \"ff71ddfd-c6da-40e7-ac26-e7178e364679\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 16 00:33:32 crc kubenswrapper[4816]: I0316 00:33:32.428350 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/ff71ddfd-c6da-40e7-ac26-e7178e364679-container-storage-root\") pod \"service-telemetry-framework-index-1-build\" (UID: \"ff71ddfd-c6da-40e7-ac26-e7178e364679\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 16 00:33:32 crc kubenswrapper[4816]: I0316 00:33:32.428698 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/ff71ddfd-c6da-40e7-ac26-e7178e364679-container-storage-run\") pod \"service-telemetry-framework-index-1-build\" (UID: \"ff71ddfd-c6da-40e7-ac26-e7178e364679\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 16 00:33:32 crc kubenswrapper[4816]: I0316 00:33:32.428827 4816 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/ff71ddfd-c6da-40e7-ac26-e7178e364679-build-blob-cache\") pod \"service-telemetry-framework-index-1-build\" (UID: \"ff71ddfd-c6da-40e7-ac26-e7178e364679\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 16 00:33:32 crc kubenswrapper[4816]: I0316 00:33:32.428848 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/ff71ddfd-c6da-40e7-ac26-e7178e364679-build-system-configs\") pod \"service-telemetry-framework-index-1-build\" (UID: \"ff71ddfd-c6da-40e7-ac26-e7178e364679\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 16 00:33:32 crc kubenswrapper[4816]: I0316 00:33:32.428917 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ff71ddfd-c6da-40e7-ac26-e7178e364679-build-proxy-ca-bundles\") pod \"service-telemetry-framework-index-1-build\" (UID: \"ff71ddfd-c6da-40e7-ac26-e7178e364679\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 16 00:33:32 crc kubenswrapper[4816]: I0316 00:33:32.428990 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/ff71ddfd-c6da-40e7-ac26-e7178e364679-node-pullsecrets\") pod \"service-telemetry-framework-index-1-build\" (UID: \"ff71ddfd-c6da-40e7-ac26-e7178e364679\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 16 00:33:32 crc kubenswrapper[4816]: I0316 00:33:32.429114 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/ff71ddfd-c6da-40e7-ac26-e7178e364679-buildworkdir\") pod \"service-telemetry-framework-index-1-build\" (UID: \"ff71ddfd-c6da-40e7-ac26-e7178e364679\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 16 
00:33:32 crc kubenswrapper[4816]: I0316 00:33:32.429615 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ff71ddfd-c6da-40e7-ac26-e7178e364679-build-ca-bundles\") pod \"service-telemetry-framework-index-1-build\" (UID: \"ff71ddfd-c6da-40e7-ac26-e7178e364679\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 16 00:33:32 crc kubenswrapper[4816]: I0316 00:33:32.429652 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/ff71ddfd-c6da-40e7-ac26-e7178e364679-container-storage-root\") pod \"service-telemetry-framework-index-1-build\" (UID: \"ff71ddfd-c6da-40e7-ac26-e7178e364679\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 16 00:33:32 crc kubenswrapper[4816]: I0316 00:33:32.433633 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-fs5z5-pull\" (UniqueName: \"kubernetes.io/secret/ff71ddfd-c6da-40e7-ac26-e7178e364679-builder-dockercfg-fs5z5-pull\") pod \"service-telemetry-framework-index-1-build\" (UID: \"ff71ddfd-c6da-40e7-ac26-e7178e364679\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 16 00:33:32 crc kubenswrapper[4816]: I0316 00:33:32.434225 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-telemetry-framework-index-dockercfg-user-build-volume\" (UniqueName: \"kubernetes.io/secret/ff71ddfd-c6da-40e7-ac26-e7178e364679-service-telemetry-framework-index-dockercfg-user-build-volume\") pod \"service-telemetry-framework-index-1-build\" (UID: \"ff71ddfd-c6da-40e7-ac26-e7178e364679\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 16 00:33:32 crc kubenswrapper[4816]: I0316 00:33:32.434420 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-fs5z5-push\" (UniqueName: 
\"kubernetes.io/secret/ff71ddfd-c6da-40e7-ac26-e7178e364679-builder-dockercfg-fs5z5-push\") pod \"service-telemetry-framework-index-1-build\" (UID: \"ff71ddfd-c6da-40e7-ac26-e7178e364679\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 16 00:33:32 crc kubenswrapper[4816]: I0316 00:33:32.446241 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rhdb\" (UniqueName: \"kubernetes.io/projected/ff71ddfd-c6da-40e7-ac26-e7178e364679-kube-api-access-9rhdb\") pod \"service-telemetry-framework-index-1-build\" (UID: \"ff71ddfd-c6da-40e7-ac26-e7178e364679\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 16 00:33:32 crc kubenswrapper[4816]: I0316 00:33:32.531097 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 16 00:33:32 crc kubenswrapper[4816]: I0316 00:33:32.722539 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-framework-index-1-build"] Mar 16 00:33:33 crc kubenswrapper[4816]: I0316 00:33:33.396001 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-index-1-build" event={"ID":"ff71ddfd-c6da-40e7-ac26-e7178e364679","Type":"ContainerStarted","Data":"4ba83f4162fdd45ac10eb6399a78b3388e8086607eb25657ce3203eaaf7bff54"} Mar 16 00:33:33 crc kubenswrapper[4816]: I0316 00:33:33.396311 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-index-1-build" event={"ID":"ff71ddfd-c6da-40e7-ac26-e7178e364679","Type":"ContainerStarted","Data":"2c2104821392a6c5290ae6c58a4c84c015279512137a840ee8488010d57afbe2"} Mar 16 00:33:34 crc kubenswrapper[4816]: I0316 00:33:34.407900 4816 generic.go:334] "Generic (PLEG): container finished" podID="ff71ddfd-c6da-40e7-ac26-e7178e364679" containerID="4ba83f4162fdd45ac10eb6399a78b3388e8086607eb25657ce3203eaaf7bff54" 
exitCode=0 Mar 16 00:33:34 crc kubenswrapper[4816]: I0316 00:33:34.408042 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-index-1-build" event={"ID":"ff71ddfd-c6da-40e7-ac26-e7178e364679","Type":"ContainerDied","Data":"4ba83f4162fdd45ac10eb6399a78b3388e8086607eb25657ce3203eaaf7bff54"} Mar 16 00:33:35 crc kubenswrapper[4816]: I0316 00:33:35.415463 4816 generic.go:334] "Generic (PLEG): container finished" podID="ff71ddfd-c6da-40e7-ac26-e7178e364679" containerID="c3ca84acff081714bca19f133df479a0c9f1aeb453b6556bb680a18c786e81f9" exitCode=0 Mar 16 00:33:35 crc kubenswrapper[4816]: I0316 00:33:35.415520 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-index-1-build" event={"ID":"ff71ddfd-c6da-40e7-ac26-e7178e364679","Type":"ContainerDied","Data":"c3ca84acff081714bca19f133df479a0c9f1aeb453b6556bb680a18c786e81f9"} Mar 16 00:33:35 crc kubenswrapper[4816]: I0316 00:33:35.451416 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-framework-index-1-build_ff71ddfd-c6da-40e7-ac26-e7178e364679/manage-dockerfile/0.log" Mar 16 00:33:36 crc kubenswrapper[4816]: I0316 00:33:36.424786 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-index-1-build" event={"ID":"ff71ddfd-c6da-40e7-ac26-e7178e364679","Type":"ContainerStarted","Data":"8b7a0d2de6d39ce58c695e7862c0bf7b6723767d296869ab9afafe7273667037"} Mar 16 00:33:36 crc kubenswrapper[4816]: I0316 00:33:36.452571 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/service-telemetry-framework-index-1-build" podStartSLOduration=4.452530745 podStartE2EDuration="4.452530745s" podCreationTimestamp="2026-03-16 00:33:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:33:36.449059229 
+0000 UTC m=+1609.545359192" watchObservedRunningTime="2026-03-16 00:33:36.452530745 +0000 UTC m=+1609.548830718" Mar 16 00:34:00 crc kubenswrapper[4816]: I0316 00:34:00.136999 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29560354-nmflm"] Mar 16 00:34:00 crc kubenswrapper[4816]: I0316 00:34:00.138417 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560354-nmflm" Mar 16 00:34:00 crc kubenswrapper[4816]: I0316 00:34:00.140443 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-8hc2r" Mar 16 00:34:00 crc kubenswrapper[4816]: I0316 00:34:00.141884 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 16 00:34:00 crc kubenswrapper[4816]: I0316 00:34:00.144320 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 16 00:34:00 crc kubenswrapper[4816]: I0316 00:34:00.149435 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29560354-nmflm"] Mar 16 00:34:00 crc kubenswrapper[4816]: I0316 00:34:00.206523 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5ct9\" (UniqueName: \"kubernetes.io/projected/4786aa78-4870-43d7-a324-e3e3dd2c7943-kube-api-access-c5ct9\") pod \"auto-csr-approver-29560354-nmflm\" (UID: \"4786aa78-4870-43d7-a324-e3e3dd2c7943\") " pod="openshift-infra/auto-csr-approver-29560354-nmflm" Mar 16 00:34:00 crc kubenswrapper[4816]: I0316 00:34:00.307532 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c5ct9\" (UniqueName: \"kubernetes.io/projected/4786aa78-4870-43d7-a324-e3e3dd2c7943-kube-api-access-c5ct9\") pod \"auto-csr-approver-29560354-nmflm\" (UID: \"4786aa78-4870-43d7-a324-e3e3dd2c7943\") " 
pod="openshift-infra/auto-csr-approver-29560354-nmflm" Mar 16 00:34:00 crc kubenswrapper[4816]: I0316 00:34:00.329900 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c5ct9\" (UniqueName: \"kubernetes.io/projected/4786aa78-4870-43d7-a324-e3e3dd2c7943-kube-api-access-c5ct9\") pod \"auto-csr-approver-29560354-nmflm\" (UID: \"4786aa78-4870-43d7-a324-e3e3dd2c7943\") " pod="openshift-infra/auto-csr-approver-29560354-nmflm" Mar 16 00:34:00 crc kubenswrapper[4816]: I0316 00:34:00.484475 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560354-nmflm" Mar 16 00:34:00 crc kubenswrapper[4816]: I0316 00:34:00.900239 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29560354-nmflm"] Mar 16 00:34:00 crc kubenswrapper[4816]: I0316 00:34:00.911095 4816 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 16 00:34:01 crc kubenswrapper[4816]: I0316 00:34:01.613335 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560354-nmflm" event={"ID":"4786aa78-4870-43d7-a324-e3e3dd2c7943","Type":"ContainerStarted","Data":"86207657188b34b5bd103fcfbc19aacf027614812a3717aef241d2ae2885907a"} Mar 16 00:34:06 crc kubenswrapper[4816]: I0316 00:34:06.648294 4816 generic.go:334] "Generic (PLEG): container finished" podID="ff71ddfd-c6da-40e7-ac26-e7178e364679" containerID="8b7a0d2de6d39ce58c695e7862c0bf7b6723767d296869ab9afafe7273667037" exitCode=0 Mar 16 00:34:06 crc kubenswrapper[4816]: I0316 00:34:06.648394 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-index-1-build" event={"ID":"ff71ddfd-c6da-40e7-ac26-e7178e364679","Type":"ContainerDied","Data":"8b7a0d2de6d39ce58c695e7862c0bf7b6723767d296869ab9afafe7273667037"} Mar 16 00:34:06 crc kubenswrapper[4816]: I0316 00:34:06.651724 4816 generic.go:334] 
"Generic (PLEG): container finished" podID="4786aa78-4870-43d7-a324-e3e3dd2c7943" containerID="0c26c66eab197680871c2539e7ed1477694cb8e32e0bc0cdad1221a9720899f7" exitCode=0 Mar 16 00:34:06 crc kubenswrapper[4816]: I0316 00:34:06.651789 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560354-nmflm" event={"ID":"4786aa78-4870-43d7-a324-e3e3dd2c7943","Type":"ContainerDied","Data":"0c26c66eab197680871c2539e7ed1477694cb8e32e0bc0cdad1221a9720899f7"} Mar 16 00:34:07 crc kubenswrapper[4816]: I0316 00:34:07.950998 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560354-nmflm" Mar 16 00:34:07 crc kubenswrapper[4816]: I0316 00:34:07.959661 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 16 00:34:08 crc kubenswrapper[4816]: I0316 00:34:08.114151 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9rhdb\" (UniqueName: \"kubernetes.io/projected/ff71ddfd-c6da-40e7-ac26-e7178e364679-kube-api-access-9rhdb\") pod \"ff71ddfd-c6da-40e7-ac26-e7178e364679\" (UID: \"ff71ddfd-c6da-40e7-ac26-e7178e364679\") " Mar 16 00:34:08 crc kubenswrapper[4816]: I0316 00:34:08.114790 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ff71ddfd-c6da-40e7-ac26-e7178e364679-build-ca-bundles\") pod \"ff71ddfd-c6da-40e7-ac26-e7178e364679\" (UID: \"ff71ddfd-c6da-40e7-ac26-e7178e364679\") " Mar 16 00:34:08 crc kubenswrapper[4816]: I0316 00:34:08.114849 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/ff71ddfd-c6da-40e7-ac26-e7178e364679-buildcachedir\") pod \"ff71ddfd-c6da-40e7-ac26-e7178e364679\" (UID: \"ff71ddfd-c6da-40e7-ac26-e7178e364679\") " Mar 16 00:34:08 
crc kubenswrapper[4816]: I0316 00:34:08.114889 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/ff71ddfd-c6da-40e7-ac26-e7178e364679-build-blob-cache\") pod \"ff71ddfd-c6da-40e7-ac26-e7178e364679\" (UID: \"ff71ddfd-c6da-40e7-ac26-e7178e364679\") " Mar 16 00:34:08 crc kubenswrapper[4816]: I0316 00:34:08.114923 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/ff71ddfd-c6da-40e7-ac26-e7178e364679-buildworkdir\") pod \"ff71ddfd-c6da-40e7-ac26-e7178e364679\" (UID: \"ff71ddfd-c6da-40e7-ac26-e7178e364679\") " Mar 16 00:34:08 crc kubenswrapper[4816]: I0316 00:34:08.114947 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/ff71ddfd-c6da-40e7-ac26-e7178e364679-build-system-configs\") pod \"ff71ddfd-c6da-40e7-ac26-e7178e364679\" (UID: \"ff71ddfd-c6da-40e7-ac26-e7178e364679\") " Mar 16 00:34:08 crc kubenswrapper[4816]: I0316 00:34:08.114995 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/ff71ddfd-c6da-40e7-ac26-e7178e364679-container-storage-run\") pod \"ff71ddfd-c6da-40e7-ac26-e7178e364679\" (UID: \"ff71ddfd-c6da-40e7-ac26-e7178e364679\") " Mar 16 00:34:08 crc kubenswrapper[4816]: I0316 00:34:08.115035 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-fs5z5-push\" (UniqueName: \"kubernetes.io/secret/ff71ddfd-c6da-40e7-ac26-e7178e364679-builder-dockercfg-fs5z5-push\") pod \"ff71ddfd-c6da-40e7-ac26-e7178e364679\" (UID: \"ff71ddfd-c6da-40e7-ac26-e7178e364679\") " Mar 16 00:34:08 crc kubenswrapper[4816]: I0316 00:34:08.115059 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/ff71ddfd-c6da-40e7-ac26-e7178e364679-build-proxy-ca-bundles\") pod \"ff71ddfd-c6da-40e7-ac26-e7178e364679\" (UID: \"ff71ddfd-c6da-40e7-ac26-e7178e364679\") " Mar 16 00:34:08 crc kubenswrapper[4816]: I0316 00:34:08.115091 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-telemetry-framework-index-dockercfg-user-build-volume\" (UniqueName: \"kubernetes.io/secret/ff71ddfd-c6da-40e7-ac26-e7178e364679-service-telemetry-framework-index-dockercfg-user-build-volume\") pod \"ff71ddfd-c6da-40e7-ac26-e7178e364679\" (UID: \"ff71ddfd-c6da-40e7-ac26-e7178e364679\") " Mar 16 00:34:08 crc kubenswrapper[4816]: I0316 00:34:08.115140 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/ff71ddfd-c6da-40e7-ac26-e7178e364679-container-storage-root\") pod \"ff71ddfd-c6da-40e7-ac26-e7178e364679\" (UID: \"ff71ddfd-c6da-40e7-ac26-e7178e364679\") " Mar 16 00:34:08 crc kubenswrapper[4816]: I0316 00:34:08.115185 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-fs5z5-pull\" (UniqueName: \"kubernetes.io/secret/ff71ddfd-c6da-40e7-ac26-e7178e364679-builder-dockercfg-fs5z5-pull\") pod \"ff71ddfd-c6da-40e7-ac26-e7178e364679\" (UID: \"ff71ddfd-c6da-40e7-ac26-e7178e364679\") " Mar 16 00:34:08 crc kubenswrapper[4816]: I0316 00:34:08.115213 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/ff71ddfd-c6da-40e7-ac26-e7178e364679-node-pullsecrets\") pod \"ff71ddfd-c6da-40e7-ac26-e7178e364679\" (UID: \"ff71ddfd-c6da-40e7-ac26-e7178e364679\") " Mar 16 00:34:08 crc kubenswrapper[4816]: I0316 00:34:08.115240 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c5ct9\" (UniqueName: 
\"kubernetes.io/projected/4786aa78-4870-43d7-a324-e3e3dd2c7943-kube-api-access-c5ct9\") pod \"4786aa78-4870-43d7-a324-e3e3dd2c7943\" (UID: \"4786aa78-4870-43d7-a324-e3e3dd2c7943\") " Mar 16 00:34:08 crc kubenswrapper[4816]: I0316 00:34:08.115628 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff71ddfd-c6da-40e7-ac26-e7178e364679-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "ff71ddfd-c6da-40e7-ac26-e7178e364679" (UID: "ff71ddfd-c6da-40e7-ac26-e7178e364679"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:34:08 crc kubenswrapper[4816]: I0316 00:34:08.115995 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ff71ddfd-c6da-40e7-ac26-e7178e364679-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "ff71ddfd-c6da-40e7-ac26-e7178e364679" (UID: "ff71ddfd-c6da-40e7-ac26-e7178e364679"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:34:08 crc kubenswrapper[4816]: I0316 00:34:08.116119 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ff71ddfd-c6da-40e7-ac26-e7178e364679-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "ff71ddfd-c6da-40e7-ac26-e7178e364679" (UID: "ff71ddfd-c6da-40e7-ac26-e7178e364679"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:34:08 crc kubenswrapper[4816]: I0316 00:34:08.116353 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff71ddfd-c6da-40e7-ac26-e7178e364679-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "ff71ddfd-c6da-40e7-ac26-e7178e364679" (UID: "ff71ddfd-c6da-40e7-ac26-e7178e364679"). InnerVolumeSpecName "buildworkdir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:34:08 crc kubenswrapper[4816]: I0316 00:34:08.116405 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff71ddfd-c6da-40e7-ac26-e7178e364679-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "ff71ddfd-c6da-40e7-ac26-e7178e364679" (UID: "ff71ddfd-c6da-40e7-ac26-e7178e364679"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:34:08 crc kubenswrapper[4816]: I0316 00:34:08.116711 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff71ddfd-c6da-40e7-ac26-e7178e364679-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "ff71ddfd-c6da-40e7-ac26-e7178e364679" (UID: "ff71ddfd-c6da-40e7-ac26-e7178e364679"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:34:08 crc kubenswrapper[4816]: I0316 00:34:08.117296 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff71ddfd-c6da-40e7-ac26-e7178e364679-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "ff71ddfd-c6da-40e7-ac26-e7178e364679" (UID: "ff71ddfd-c6da-40e7-ac26-e7178e364679"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:34:08 crc kubenswrapper[4816]: I0316 00:34:08.119671 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff71ddfd-c6da-40e7-ac26-e7178e364679-builder-dockercfg-fs5z5-pull" (OuterVolumeSpecName: "builder-dockercfg-fs5z5-pull") pod "ff71ddfd-c6da-40e7-ac26-e7178e364679" (UID: "ff71ddfd-c6da-40e7-ac26-e7178e364679"). InnerVolumeSpecName "builder-dockercfg-fs5z5-pull". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:34:08 crc kubenswrapper[4816]: I0316 00:34:08.119800 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff71ddfd-c6da-40e7-ac26-e7178e364679-service-telemetry-framework-index-dockercfg-user-build-volume" (OuterVolumeSpecName: "service-telemetry-framework-index-dockercfg-user-build-volume") pod "ff71ddfd-c6da-40e7-ac26-e7178e364679" (UID: "ff71ddfd-c6da-40e7-ac26-e7178e364679"). InnerVolumeSpecName "service-telemetry-framework-index-dockercfg-user-build-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:34:08 crc kubenswrapper[4816]: I0316 00:34:08.120437 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff71ddfd-c6da-40e7-ac26-e7178e364679-builder-dockercfg-fs5z5-push" (OuterVolumeSpecName: "builder-dockercfg-fs5z5-push") pod "ff71ddfd-c6da-40e7-ac26-e7178e364679" (UID: "ff71ddfd-c6da-40e7-ac26-e7178e364679"). InnerVolumeSpecName "builder-dockercfg-fs5z5-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:34:08 crc kubenswrapper[4816]: I0316 00:34:08.120711 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff71ddfd-c6da-40e7-ac26-e7178e364679-kube-api-access-9rhdb" (OuterVolumeSpecName: "kube-api-access-9rhdb") pod "ff71ddfd-c6da-40e7-ac26-e7178e364679" (UID: "ff71ddfd-c6da-40e7-ac26-e7178e364679"). InnerVolumeSpecName "kube-api-access-9rhdb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:34:08 crc kubenswrapper[4816]: I0316 00:34:08.121299 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4786aa78-4870-43d7-a324-e3e3dd2c7943-kube-api-access-c5ct9" (OuterVolumeSpecName: "kube-api-access-c5ct9") pod "4786aa78-4870-43d7-a324-e3e3dd2c7943" (UID: "4786aa78-4870-43d7-a324-e3e3dd2c7943"). InnerVolumeSpecName "kube-api-access-c5ct9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:34:08 crc kubenswrapper[4816]: I0316 00:34:08.217346 4816 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/ff71ddfd-c6da-40e7-ac26-e7178e364679-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 16 00:34:08 crc kubenswrapper[4816]: I0316 00:34:08.217391 4816 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/ff71ddfd-c6da-40e7-ac26-e7178e364679-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 16 00:34:08 crc kubenswrapper[4816]: I0316 00:34:08.217408 4816 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/ff71ddfd-c6da-40e7-ac26-e7178e364679-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 16 00:34:08 crc kubenswrapper[4816]: I0316 00:34:08.217420 4816 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-fs5z5-push\" (UniqueName: \"kubernetes.io/secret/ff71ddfd-c6da-40e7-ac26-e7178e364679-builder-dockercfg-fs5z5-push\") on node \"crc\" DevicePath \"\"" Mar 16 00:34:08 crc kubenswrapper[4816]: I0316 00:34:08.217432 4816 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ff71ddfd-c6da-40e7-ac26-e7178e364679-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 16 00:34:08 crc kubenswrapper[4816]: I0316 00:34:08.217445 4816 reconciler_common.go:293] "Volume detached for volume \"service-telemetry-framework-index-dockercfg-user-build-volume\" (UniqueName: \"kubernetes.io/secret/ff71ddfd-c6da-40e7-ac26-e7178e364679-service-telemetry-framework-index-dockercfg-user-build-volume\") on node \"crc\" DevicePath \"\"" Mar 16 00:34:08 crc kubenswrapper[4816]: I0316 00:34:08.217460 4816 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-fs5z5-pull\" (UniqueName: 
\"kubernetes.io/secret/ff71ddfd-c6da-40e7-ac26-e7178e364679-builder-dockercfg-fs5z5-pull\") on node \"crc\" DevicePath \"\"" Mar 16 00:34:08 crc kubenswrapper[4816]: I0316 00:34:08.217471 4816 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/ff71ddfd-c6da-40e7-ac26-e7178e364679-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 16 00:34:08 crc kubenswrapper[4816]: I0316 00:34:08.217483 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c5ct9\" (UniqueName: \"kubernetes.io/projected/4786aa78-4870-43d7-a324-e3e3dd2c7943-kube-api-access-c5ct9\") on node \"crc\" DevicePath \"\"" Mar 16 00:34:08 crc kubenswrapper[4816]: I0316 00:34:08.217494 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9rhdb\" (UniqueName: \"kubernetes.io/projected/ff71ddfd-c6da-40e7-ac26-e7178e364679-kube-api-access-9rhdb\") on node \"crc\" DevicePath \"\"" Mar 16 00:34:08 crc kubenswrapper[4816]: I0316 00:34:08.217504 4816 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ff71ddfd-c6da-40e7-ac26-e7178e364679-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 16 00:34:08 crc kubenswrapper[4816]: I0316 00:34:08.217515 4816 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/ff71ddfd-c6da-40e7-ac26-e7178e364679-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 16 00:34:08 crc kubenswrapper[4816]: I0316 00:34:08.657114 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff71ddfd-c6da-40e7-ac26-e7178e364679-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "ff71ddfd-c6da-40e7-ac26-e7178e364679" (UID: "ff71ddfd-c6da-40e7-ac26-e7178e364679"). InnerVolumeSpecName "build-blob-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:34:08 crc kubenswrapper[4816]: I0316 00:34:08.667234 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560354-nmflm" event={"ID":"4786aa78-4870-43d7-a324-e3e3dd2c7943","Type":"ContainerDied","Data":"86207657188b34b5bd103fcfbc19aacf027614812a3717aef241d2ae2885907a"} Mar 16 00:34:08 crc kubenswrapper[4816]: I0316 00:34:08.667256 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560354-nmflm" Mar 16 00:34:08 crc kubenswrapper[4816]: I0316 00:34:08.667267 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="86207657188b34b5bd103fcfbc19aacf027614812a3717aef241d2ae2885907a" Mar 16 00:34:08 crc kubenswrapper[4816]: I0316 00:34:08.670617 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-index-1-build" event={"ID":"ff71ddfd-c6da-40e7-ac26-e7178e364679","Type":"ContainerDied","Data":"2c2104821392a6c5290ae6c58a4c84c015279512137a840ee8488010d57afbe2"} Mar 16 00:34:08 crc kubenswrapper[4816]: I0316 00:34:08.670670 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2c2104821392a6c5290ae6c58a4c84c015279512137a840ee8488010d57afbe2" Mar 16 00:34:08 crc kubenswrapper[4816]: I0316 00:34:08.670768 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 16 00:34:08 crc kubenswrapper[4816]: I0316 00:34:08.723434 4816 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/ff71ddfd-c6da-40e7-ac26-e7178e364679-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 16 00:34:09 crc kubenswrapper[4816]: I0316 00:34:09.004607 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29560348-xvv6w"] Mar 16 00:34:09 crc kubenswrapper[4816]: I0316 00:34:09.009099 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29560348-xvv6w"] Mar 16 00:34:09 crc kubenswrapper[4816]: I0316 00:34:09.404669 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff71ddfd-c6da-40e7-ac26-e7178e364679-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "ff71ddfd-c6da-40e7-ac26-e7178e364679" (UID: "ff71ddfd-c6da-40e7-ac26-e7178e364679"). InnerVolumeSpecName "container-storage-root". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:34:09 crc kubenswrapper[4816]: I0316 00:34:09.433237 4816 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/ff71ddfd-c6da-40e7-ac26-e7178e364679-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 16 00:34:09 crc kubenswrapper[4816]: I0316 00:34:09.681216 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a529fd1f-66e5-4e49-b95a-18c6a8aade4b" path="/var/lib/kubelet/pods/a529fd1f-66e5-4e49-b95a-18c6a8aade4b/volumes" Mar 16 00:34:10 crc kubenswrapper[4816]: I0316 00:34:10.030929 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/infrawatch-operators-8g6sv"] Mar 16 00:34:10 crc kubenswrapper[4816]: E0316 00:34:10.031254 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4786aa78-4870-43d7-a324-e3e3dd2c7943" containerName="oc" Mar 16 00:34:10 crc kubenswrapper[4816]: I0316 00:34:10.031269 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="4786aa78-4870-43d7-a324-e3e3dd2c7943" containerName="oc" Mar 16 00:34:10 crc kubenswrapper[4816]: E0316 00:34:10.031284 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff71ddfd-c6da-40e7-ac26-e7178e364679" containerName="manage-dockerfile" Mar 16 00:34:10 crc kubenswrapper[4816]: I0316 00:34:10.031292 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff71ddfd-c6da-40e7-ac26-e7178e364679" containerName="manage-dockerfile" Mar 16 00:34:10 crc kubenswrapper[4816]: E0316 00:34:10.031308 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff71ddfd-c6da-40e7-ac26-e7178e364679" containerName="git-clone" Mar 16 00:34:10 crc kubenswrapper[4816]: I0316 00:34:10.031315 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff71ddfd-c6da-40e7-ac26-e7178e364679" containerName="git-clone" Mar 16 00:34:10 crc kubenswrapper[4816]: E0316 00:34:10.031327 4816 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff71ddfd-c6da-40e7-ac26-e7178e364679" containerName="docker-build" Mar 16 00:34:10 crc kubenswrapper[4816]: I0316 00:34:10.031335 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff71ddfd-c6da-40e7-ac26-e7178e364679" containerName="docker-build" Mar 16 00:34:10 crc kubenswrapper[4816]: I0316 00:34:10.031482 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff71ddfd-c6da-40e7-ac26-e7178e364679" containerName="docker-build" Mar 16 00:34:10 crc kubenswrapper[4816]: I0316 00:34:10.031495 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="4786aa78-4870-43d7-a324-e3e3dd2c7943" containerName="oc" Mar 16 00:34:10 crc kubenswrapper[4816]: I0316 00:34:10.032058 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/infrawatch-operators-8g6sv" Mar 16 00:34:10 crc kubenswrapper[4816]: I0316 00:34:10.036775 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/infrawatch-operators-8g6sv"] Mar 16 00:34:10 crc kubenswrapper[4816]: I0316 00:34:10.036889 4816 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"infrawatch-operators-dockercfg-m64wm" Mar 16 00:34:10 crc kubenswrapper[4816]: I0316 00:34:10.142216 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6skvx\" (UniqueName: \"kubernetes.io/projected/4abfb758-2aed-48a9-ab16-b7564942a72f-kube-api-access-6skvx\") pod \"infrawatch-operators-8g6sv\" (UID: \"4abfb758-2aed-48a9-ab16-b7564942a72f\") " pod="service-telemetry/infrawatch-operators-8g6sv" Mar 16 00:34:10 crc kubenswrapper[4816]: I0316 00:34:10.243668 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6skvx\" (UniqueName: \"kubernetes.io/projected/4abfb758-2aed-48a9-ab16-b7564942a72f-kube-api-access-6skvx\") pod \"infrawatch-operators-8g6sv\" (UID: 
\"4abfb758-2aed-48a9-ab16-b7564942a72f\") " pod="service-telemetry/infrawatch-operators-8g6sv" Mar 16 00:34:10 crc kubenswrapper[4816]: I0316 00:34:10.268526 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6skvx\" (UniqueName: \"kubernetes.io/projected/4abfb758-2aed-48a9-ab16-b7564942a72f-kube-api-access-6skvx\") pod \"infrawatch-operators-8g6sv\" (UID: \"4abfb758-2aed-48a9-ab16-b7564942a72f\") " pod="service-telemetry/infrawatch-operators-8g6sv" Mar 16 00:34:10 crc kubenswrapper[4816]: I0316 00:34:10.345821 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/infrawatch-operators-8g6sv" Mar 16 00:34:10 crc kubenswrapper[4816]: I0316 00:34:10.532856 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/infrawatch-operators-8g6sv"] Mar 16 00:34:10 crc kubenswrapper[4816]: I0316 00:34:10.689273 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/infrawatch-operators-8g6sv" event={"ID":"4abfb758-2aed-48a9-ab16-b7564942a72f","Type":"ContainerStarted","Data":"c0efc16eea6c4c5b32f050536aed0e0e924ed1b0fd0dfdea31edabd998b22a17"} Mar 16 00:34:14 crc kubenswrapper[4816]: I0316 00:34:14.825901 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/infrawatch-operators-8g6sv"] Mar 16 00:34:15 crc kubenswrapper[4816]: I0316 00:34:15.626927 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/infrawatch-operators-fm45p"] Mar 16 00:34:15 crc kubenswrapper[4816]: I0316 00:34:15.629127 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/infrawatch-operators-fm45p" Mar 16 00:34:15 crc kubenswrapper[4816]: I0316 00:34:15.638333 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/infrawatch-operators-fm45p"] Mar 16 00:34:15 crc kubenswrapper[4816]: I0316 00:34:15.818180 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bvf2k\" (UniqueName: \"kubernetes.io/projected/ffeac517-cf5e-4a11-898c-8bee3a6e9ee3-kube-api-access-bvf2k\") pod \"infrawatch-operators-fm45p\" (UID: \"ffeac517-cf5e-4a11-898c-8bee3a6e9ee3\") " pod="service-telemetry/infrawatch-operators-fm45p" Mar 16 00:34:15 crc kubenswrapper[4816]: I0316 00:34:15.919728 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bvf2k\" (UniqueName: \"kubernetes.io/projected/ffeac517-cf5e-4a11-898c-8bee3a6e9ee3-kube-api-access-bvf2k\") pod \"infrawatch-operators-fm45p\" (UID: \"ffeac517-cf5e-4a11-898c-8bee3a6e9ee3\") " pod="service-telemetry/infrawatch-operators-fm45p" Mar 16 00:34:15 crc kubenswrapper[4816]: I0316 00:34:15.945655 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bvf2k\" (UniqueName: \"kubernetes.io/projected/ffeac517-cf5e-4a11-898c-8bee3a6e9ee3-kube-api-access-bvf2k\") pod \"infrawatch-operators-fm45p\" (UID: \"ffeac517-cf5e-4a11-898c-8bee3a6e9ee3\") " pod="service-telemetry/infrawatch-operators-fm45p" Mar 16 00:34:15 crc kubenswrapper[4816]: I0316 00:34:15.975113 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/infrawatch-operators-fm45p" Mar 16 00:34:20 crc kubenswrapper[4816]: I0316 00:34:20.957199 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/infrawatch-operators-fm45p"] Mar 16 00:34:21 crc kubenswrapper[4816]: I0316 00:34:21.777916 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/infrawatch-operators-fm45p" event={"ID":"ffeac517-cf5e-4a11-898c-8bee3a6e9ee3","Type":"ContainerStarted","Data":"6cf2f19e9690d3f84b8debbcca8669acb16b57dfcfbb47cd7268142a254e8a78"} Mar 16 00:34:21 crc kubenswrapper[4816]: I0316 00:34:21.778722 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/infrawatch-operators-fm45p" event={"ID":"ffeac517-cf5e-4a11-898c-8bee3a6e9ee3","Type":"ContainerStarted","Data":"227449b363bbf808a717b95b92e525ea3093476cd0c46bd8fa099fd2019d3718"} Mar 16 00:34:21 crc kubenswrapper[4816]: I0316 00:34:21.780042 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/infrawatch-operators-8g6sv" event={"ID":"4abfb758-2aed-48a9-ab16-b7564942a72f","Type":"ContainerStarted","Data":"3fa73c35b4a3181ba735935cd685780824124679b15b259f19936e9a76792aab"} Mar 16 00:34:21 crc kubenswrapper[4816]: I0316 00:34:21.780158 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/infrawatch-operators-8g6sv" podUID="4abfb758-2aed-48a9-ab16-b7564942a72f" containerName="registry-server" containerID="cri-o://3fa73c35b4a3181ba735935cd685780824124679b15b259f19936e9a76792aab" gracePeriod=2 Mar 16 00:34:21 crc kubenswrapper[4816]: I0316 00:34:21.797660 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/infrawatch-operators-fm45p" podStartSLOduration=6.642028191 podStartE2EDuration="6.797634904s" podCreationTimestamp="2026-03-16 00:34:15 +0000 UTC" firstStartedPulling="2026-03-16 00:34:20.995680519 +0000 UTC m=+1654.091980472" lastFinishedPulling="2026-03-16 
00:34:21.151287232 +0000 UTC m=+1654.247587185" observedRunningTime="2026-03-16 00:34:21.792219544 +0000 UTC m=+1654.888519497" watchObservedRunningTime="2026-03-16 00:34:21.797634904 +0000 UTC m=+1654.893934857" Mar 16 00:34:21 crc kubenswrapper[4816]: I0316 00:34:21.813694 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/infrawatch-operators-8g6sv" podStartSLOduration=1.373134268 podStartE2EDuration="11.813667088s" podCreationTimestamp="2026-03-16 00:34:10 +0000 UTC" firstStartedPulling="2026-03-16 00:34:10.541032878 +0000 UTC m=+1643.637332831" lastFinishedPulling="2026-03-16 00:34:20.981565698 +0000 UTC m=+1654.077865651" observedRunningTime="2026-03-16 00:34:21.81229881 +0000 UTC m=+1654.908598773" watchObservedRunningTime="2026-03-16 00:34:21.813667088 +0000 UTC m=+1654.909967041" Mar 16 00:34:22 crc kubenswrapper[4816]: I0316 00:34:22.200676 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/infrawatch-operators-8g6sv" Mar 16 00:34:22 crc kubenswrapper[4816]: I0316 00:34:22.204981 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6skvx\" (UniqueName: \"kubernetes.io/projected/4abfb758-2aed-48a9-ab16-b7564942a72f-kube-api-access-6skvx\") pod \"4abfb758-2aed-48a9-ab16-b7564942a72f\" (UID: \"4abfb758-2aed-48a9-ab16-b7564942a72f\") " Mar 16 00:34:22 crc kubenswrapper[4816]: I0316 00:34:22.213890 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4abfb758-2aed-48a9-ab16-b7564942a72f-kube-api-access-6skvx" (OuterVolumeSpecName: "kube-api-access-6skvx") pod "4abfb758-2aed-48a9-ab16-b7564942a72f" (UID: "4abfb758-2aed-48a9-ab16-b7564942a72f"). InnerVolumeSpecName "kube-api-access-6skvx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:34:22 crc kubenswrapper[4816]: I0316 00:34:22.306366 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6skvx\" (UniqueName: \"kubernetes.io/projected/4abfb758-2aed-48a9-ab16-b7564942a72f-kube-api-access-6skvx\") on node \"crc\" DevicePath \"\"" Mar 16 00:34:22 crc kubenswrapper[4816]: I0316 00:34:22.786763 4816 generic.go:334] "Generic (PLEG): container finished" podID="4abfb758-2aed-48a9-ab16-b7564942a72f" containerID="3fa73c35b4a3181ba735935cd685780824124679b15b259f19936e9a76792aab" exitCode=0 Mar 16 00:34:22 crc kubenswrapper[4816]: I0316 00:34:22.786832 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/infrawatch-operators-8g6sv" Mar 16 00:34:22 crc kubenswrapper[4816]: I0316 00:34:22.786885 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/infrawatch-operators-8g6sv" event={"ID":"4abfb758-2aed-48a9-ab16-b7564942a72f","Type":"ContainerDied","Data":"3fa73c35b4a3181ba735935cd685780824124679b15b259f19936e9a76792aab"} Mar 16 00:34:22 crc kubenswrapper[4816]: I0316 00:34:22.786927 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/infrawatch-operators-8g6sv" event={"ID":"4abfb758-2aed-48a9-ab16-b7564942a72f","Type":"ContainerDied","Data":"c0efc16eea6c4c5b32f050536aed0e0e924ed1b0fd0dfdea31edabd998b22a17"} Mar 16 00:34:22 crc kubenswrapper[4816]: I0316 00:34:22.786949 4816 scope.go:117] "RemoveContainer" containerID="3fa73c35b4a3181ba735935cd685780824124679b15b259f19936e9a76792aab" Mar 16 00:34:22 crc kubenswrapper[4816]: I0316 00:34:22.811236 4816 scope.go:117] "RemoveContainer" containerID="3fa73c35b4a3181ba735935cd685780824124679b15b259f19936e9a76792aab" Mar 16 00:34:22 crc kubenswrapper[4816]: E0316 00:34:22.811689 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"3fa73c35b4a3181ba735935cd685780824124679b15b259f19936e9a76792aab\": container with ID starting with 3fa73c35b4a3181ba735935cd685780824124679b15b259f19936e9a76792aab not found: ID does not exist" containerID="3fa73c35b4a3181ba735935cd685780824124679b15b259f19936e9a76792aab" Mar 16 00:34:22 crc kubenswrapper[4816]: I0316 00:34:22.811730 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3fa73c35b4a3181ba735935cd685780824124679b15b259f19936e9a76792aab"} err="failed to get container status \"3fa73c35b4a3181ba735935cd685780824124679b15b259f19936e9a76792aab\": rpc error: code = NotFound desc = could not find container \"3fa73c35b4a3181ba735935cd685780824124679b15b259f19936e9a76792aab\": container with ID starting with 3fa73c35b4a3181ba735935cd685780824124679b15b259f19936e9a76792aab not found: ID does not exist" Mar 16 00:34:22 crc kubenswrapper[4816]: I0316 00:34:22.812984 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/infrawatch-operators-8g6sv"] Mar 16 00:34:22 crc kubenswrapper[4816]: I0316 00:34:22.819371 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/infrawatch-operators-8g6sv"] Mar 16 00:34:23 crc kubenswrapper[4816]: I0316 00:34:23.678455 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4abfb758-2aed-48a9-ab16-b7564942a72f" path="/var/lib/kubelet/pods/4abfb758-2aed-48a9-ab16-b7564942a72f/volumes" Mar 16 00:34:25 crc kubenswrapper[4816]: I0316 00:34:25.976115 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="service-telemetry/infrawatch-operators-fm45p" Mar 16 00:34:25 crc kubenswrapper[4816]: I0316 00:34:25.976211 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="service-telemetry/infrawatch-operators-fm45p" Mar 16 00:34:26 crc kubenswrapper[4816]: I0316 00:34:26.001406 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="service-telemetry/infrawatch-operators-fm45p" Mar 16 00:34:26 crc kubenswrapper[4816]: I0316 00:34:26.848109 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="service-telemetry/infrawatch-operators-fm45p" Mar 16 00:34:31 crc kubenswrapper[4816]: I0316 00:34:31.885838 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a9tbth"] Mar 16 00:34:31 crc kubenswrapper[4816]: E0316 00:34:31.886730 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4abfb758-2aed-48a9-ab16-b7564942a72f" containerName="registry-server" Mar 16 00:34:31 crc kubenswrapper[4816]: I0316 00:34:31.886752 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="4abfb758-2aed-48a9-ab16-b7564942a72f" containerName="registry-server" Mar 16 00:34:31 crc kubenswrapper[4816]: I0316 00:34:31.886947 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="4abfb758-2aed-48a9-ab16-b7564942a72f" containerName="registry-server" Mar 16 00:34:31 crc kubenswrapper[4816]: I0316 00:34:31.888832 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a9tbth" Mar 16 00:34:31 crc kubenswrapper[4816]: I0316 00:34:31.897125 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a9tbth"] Mar 16 00:34:32 crc kubenswrapper[4816]: I0316 00:34:32.023927 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0b78944c-c894-4d7f-bbe3-82eee916db70-util\") pod \"500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a9tbth\" (UID: \"0b78944c-c894-4d7f-bbe3-82eee916db70\") " pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a9tbth" Mar 16 00:34:32 crc kubenswrapper[4816]: I0316 00:34:32.023990 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qkzgc\" (UniqueName: \"kubernetes.io/projected/0b78944c-c894-4d7f-bbe3-82eee916db70-kube-api-access-qkzgc\") pod \"500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a9tbth\" (UID: \"0b78944c-c894-4d7f-bbe3-82eee916db70\") " pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a9tbth" Mar 16 00:34:32 crc kubenswrapper[4816]: I0316 00:34:32.024065 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0b78944c-c894-4d7f-bbe3-82eee916db70-bundle\") pod \"500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a9tbth\" (UID: \"0b78944c-c894-4d7f-bbe3-82eee916db70\") " pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a9tbth" Mar 16 00:34:32 crc kubenswrapper[4816]: I0316 00:34:32.124976 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0b78944c-c894-4d7f-bbe3-82eee916db70-bundle\") 
pod \"500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a9tbth\" (UID: \"0b78944c-c894-4d7f-bbe3-82eee916db70\") " pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a9tbth" Mar 16 00:34:32 crc kubenswrapper[4816]: I0316 00:34:32.125078 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0b78944c-c894-4d7f-bbe3-82eee916db70-util\") pod \"500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a9tbth\" (UID: \"0b78944c-c894-4d7f-bbe3-82eee916db70\") " pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a9tbth" Mar 16 00:34:32 crc kubenswrapper[4816]: I0316 00:34:32.125101 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qkzgc\" (UniqueName: \"kubernetes.io/projected/0b78944c-c894-4d7f-bbe3-82eee916db70-kube-api-access-qkzgc\") pod \"500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a9tbth\" (UID: \"0b78944c-c894-4d7f-bbe3-82eee916db70\") " pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a9tbth" Mar 16 00:34:32 crc kubenswrapper[4816]: I0316 00:34:32.126063 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0b78944c-c894-4d7f-bbe3-82eee916db70-bundle\") pod \"500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a9tbth\" (UID: \"0b78944c-c894-4d7f-bbe3-82eee916db70\") " pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a9tbth" Mar 16 00:34:32 crc kubenswrapper[4816]: I0316 00:34:32.126366 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0b78944c-c894-4d7f-bbe3-82eee916db70-util\") pod \"500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a9tbth\" (UID: \"0b78944c-c894-4d7f-bbe3-82eee916db70\") " 
pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a9tbth" Mar 16 00:34:32 crc kubenswrapper[4816]: I0316 00:34:32.142063 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qkzgc\" (UniqueName: \"kubernetes.io/projected/0b78944c-c894-4d7f-bbe3-82eee916db70-kube-api-access-qkzgc\") pod \"500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a9tbth\" (UID: \"0b78944c-c894-4d7f-bbe3-82eee916db70\") " pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a9tbth" Mar 16 00:34:32 crc kubenswrapper[4816]: I0316 00:34:32.208805 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a9tbth" Mar 16 00:34:32 crc kubenswrapper[4816]: I0316 00:34:32.627518 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a9tbth"] Mar 16 00:34:32 crc kubenswrapper[4816]: I0316 00:34:32.854162 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a9tbth" event={"ID":"0b78944c-c894-4d7f-bbe3-82eee916db70","Type":"ContainerStarted","Data":"6b420a3589052a6db1afcf2db384d95e64c97ce42dcee5c381e8a7882749e59b"} Mar 16 00:34:32 crc kubenswrapper[4816]: I0316 00:34:32.854214 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a9tbth" event={"ID":"0b78944c-c894-4d7f-bbe3-82eee916db70","Type":"ContainerStarted","Data":"d790c89c5f398f362a0ba954ed74496a86b4d207075a97a2dba1f9ba8fe354b1"} Mar 16 00:34:32 crc kubenswrapper[4816]: I0316 00:34:32.919113 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09j6cjz"] Mar 16 00:34:32 crc kubenswrapper[4816]: I0316 00:34:32.921168 4816 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09j6cjz" Mar 16 00:34:32 crc kubenswrapper[4816]: I0316 00:34:32.932533 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09j6cjz"] Mar 16 00:34:32 crc kubenswrapper[4816]: I0316 00:34:32.934008 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f6754fbe-ac20-4fe6-8c87-6d30f20069b9-util\") pod \"372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09j6cjz\" (UID: \"f6754fbe-ac20-4fe6-8c87-6d30f20069b9\") " pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09j6cjz" Mar 16 00:34:32 crc kubenswrapper[4816]: I0316 00:34:32.934064 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f6754fbe-ac20-4fe6-8c87-6d30f20069b9-bundle\") pod \"372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09j6cjz\" (UID: \"f6754fbe-ac20-4fe6-8c87-6d30f20069b9\") " pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09j6cjz" Mar 16 00:34:32 crc kubenswrapper[4816]: I0316 00:34:32.934106 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pf99h\" (UniqueName: \"kubernetes.io/projected/f6754fbe-ac20-4fe6-8c87-6d30f20069b9-kube-api-access-pf99h\") pod \"372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09j6cjz\" (UID: \"f6754fbe-ac20-4fe6-8c87-6d30f20069b9\") " pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09j6cjz" Mar 16 00:34:33 crc kubenswrapper[4816]: I0316 00:34:33.035173 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/f6754fbe-ac20-4fe6-8c87-6d30f20069b9-util\") pod \"372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09j6cjz\" (UID: \"f6754fbe-ac20-4fe6-8c87-6d30f20069b9\") " pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09j6cjz" Mar 16 00:34:33 crc kubenswrapper[4816]: I0316 00:34:33.035275 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f6754fbe-ac20-4fe6-8c87-6d30f20069b9-bundle\") pod \"372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09j6cjz\" (UID: \"f6754fbe-ac20-4fe6-8c87-6d30f20069b9\") " pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09j6cjz" Mar 16 00:34:33 crc kubenswrapper[4816]: I0316 00:34:33.035319 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pf99h\" (UniqueName: \"kubernetes.io/projected/f6754fbe-ac20-4fe6-8c87-6d30f20069b9-kube-api-access-pf99h\") pod \"372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09j6cjz\" (UID: \"f6754fbe-ac20-4fe6-8c87-6d30f20069b9\") " pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09j6cjz" Mar 16 00:34:33 crc kubenswrapper[4816]: I0316 00:34:33.035842 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f6754fbe-ac20-4fe6-8c87-6d30f20069b9-bundle\") pod \"372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09j6cjz\" (UID: \"f6754fbe-ac20-4fe6-8c87-6d30f20069b9\") " pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09j6cjz" Mar 16 00:34:33 crc kubenswrapper[4816]: I0316 00:34:33.036028 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f6754fbe-ac20-4fe6-8c87-6d30f20069b9-util\") pod \"372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09j6cjz\" (UID: 
\"f6754fbe-ac20-4fe6-8c87-6d30f20069b9\") " pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09j6cjz" Mar 16 00:34:33 crc kubenswrapper[4816]: I0316 00:34:33.053842 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pf99h\" (UniqueName: \"kubernetes.io/projected/f6754fbe-ac20-4fe6-8c87-6d30f20069b9-kube-api-access-pf99h\") pod \"372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09j6cjz\" (UID: \"f6754fbe-ac20-4fe6-8c87-6d30f20069b9\") " pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09j6cjz" Mar 16 00:34:33 crc kubenswrapper[4816]: I0316 00:34:33.301122 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09j6cjz" Mar 16 00:34:33 crc kubenswrapper[4816]: I0316 00:34:33.788120 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09j6cjz"] Mar 16 00:34:33 crc kubenswrapper[4816]: I0316 00:34:33.866681 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09j6cjz" event={"ID":"f6754fbe-ac20-4fe6-8c87-6d30f20069b9","Type":"ContainerStarted","Data":"add5543c2b87b62fcb6cc941f4e9d34042291c66893841625339e4cb436c7477"} Mar 16 00:34:33 crc kubenswrapper[4816]: I0316 00:34:33.867961 4816 generic.go:334] "Generic (PLEG): container finished" podID="0b78944c-c894-4d7f-bbe3-82eee916db70" containerID="6b420a3589052a6db1afcf2db384d95e64c97ce42dcee5c381e8a7882749e59b" exitCode=0 Mar 16 00:34:33 crc kubenswrapper[4816]: I0316 00:34:33.868001 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a9tbth" 
event={"ID":"0b78944c-c894-4d7f-bbe3-82eee916db70","Type":"ContainerDied","Data":"6b420a3589052a6db1afcf2db384d95e64c97ce42dcee5c381e8a7882749e59b"} Mar 16 00:34:34 crc kubenswrapper[4816]: I0316 00:34:34.876774 4816 generic.go:334] "Generic (PLEG): container finished" podID="f6754fbe-ac20-4fe6-8c87-6d30f20069b9" containerID="18a1d999c028a245479511118fd18d5623e31bf0ba1ed36a8923351ebd56c713" exitCode=0 Mar 16 00:34:34 crc kubenswrapper[4816]: I0316 00:34:34.876867 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09j6cjz" event={"ID":"f6754fbe-ac20-4fe6-8c87-6d30f20069b9","Type":"ContainerDied","Data":"18a1d999c028a245479511118fd18d5623e31bf0ba1ed36a8923351ebd56c713"} Mar 16 00:34:34 crc kubenswrapper[4816]: I0316 00:34:34.879124 4816 generic.go:334] "Generic (PLEG): container finished" podID="0b78944c-c894-4d7f-bbe3-82eee916db70" containerID="5d870941f7b3e569fa84c49a82db1748056ec3fdd6cc2b23128235b07e7d93c9" exitCode=0 Mar 16 00:34:34 crc kubenswrapper[4816]: I0316 00:34:34.879151 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a9tbth" event={"ID":"0b78944c-c894-4d7f-bbe3-82eee916db70","Type":"ContainerDied","Data":"5d870941f7b3e569fa84c49a82db1748056ec3fdd6cc2b23128235b07e7d93c9"} Mar 16 00:34:35 crc kubenswrapper[4816]: I0316 00:34:35.886392 4816 generic.go:334] "Generic (PLEG): container finished" podID="0b78944c-c894-4d7f-bbe3-82eee916db70" containerID="9c05ec3de572be28234263125f6671d9ce90d8114ad7a91af13eef9b53e34db1" exitCode=0 Mar 16 00:34:35 crc kubenswrapper[4816]: I0316 00:34:35.886669 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a9tbth" event={"ID":"0b78944c-c894-4d7f-bbe3-82eee916db70","Type":"ContainerDied","Data":"9c05ec3de572be28234263125f6671d9ce90d8114ad7a91af13eef9b53e34db1"} Mar 16 00:34:35 
crc kubenswrapper[4816]: I0316 00:34:35.888093 4816 generic.go:334] "Generic (PLEG): container finished" podID="f6754fbe-ac20-4fe6-8c87-6d30f20069b9" containerID="725cfe790af8de5097674d7a212a554bdd9fec74a150c10cb54be0bcef0edf7a" exitCode=0 Mar 16 00:34:35 crc kubenswrapper[4816]: I0316 00:34:35.888144 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09j6cjz" event={"ID":"f6754fbe-ac20-4fe6-8c87-6d30f20069b9","Type":"ContainerDied","Data":"725cfe790af8de5097674d7a212a554bdd9fec74a150c10cb54be0bcef0edf7a"} Mar 16 00:34:36 crc kubenswrapper[4816]: I0316 00:34:36.923953 4816 generic.go:334] "Generic (PLEG): container finished" podID="f6754fbe-ac20-4fe6-8c87-6d30f20069b9" containerID="f69b2062e753e9aeeaba47277e0394d00ff23e47737c987053cc8946cc8b6c75" exitCode=0 Mar 16 00:34:36 crc kubenswrapper[4816]: I0316 00:34:36.925156 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09j6cjz" event={"ID":"f6754fbe-ac20-4fe6-8c87-6d30f20069b9","Type":"ContainerDied","Data":"f69b2062e753e9aeeaba47277e0394d00ff23e47737c987053cc8946cc8b6c75"} Mar 16 00:34:37 crc kubenswrapper[4816]: I0316 00:34:37.211445 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a9tbth" Mar 16 00:34:37 crc kubenswrapper[4816]: I0316 00:34:37.392389 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0b78944c-c894-4d7f-bbe3-82eee916db70-bundle\") pod \"0b78944c-c894-4d7f-bbe3-82eee916db70\" (UID: \"0b78944c-c894-4d7f-bbe3-82eee916db70\") " Mar 16 00:34:37 crc kubenswrapper[4816]: I0316 00:34:37.392522 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0b78944c-c894-4d7f-bbe3-82eee916db70-util\") pod \"0b78944c-c894-4d7f-bbe3-82eee916db70\" (UID: \"0b78944c-c894-4d7f-bbe3-82eee916db70\") " Mar 16 00:34:37 crc kubenswrapper[4816]: I0316 00:34:37.393048 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0b78944c-c894-4d7f-bbe3-82eee916db70-bundle" (OuterVolumeSpecName: "bundle") pod "0b78944c-c894-4d7f-bbe3-82eee916db70" (UID: "0b78944c-c894-4d7f-bbe3-82eee916db70"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:34:37 crc kubenswrapper[4816]: I0316 00:34:37.397736 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qkzgc\" (UniqueName: \"kubernetes.io/projected/0b78944c-c894-4d7f-bbe3-82eee916db70-kube-api-access-qkzgc\") pod \"0b78944c-c894-4d7f-bbe3-82eee916db70\" (UID: \"0b78944c-c894-4d7f-bbe3-82eee916db70\") " Mar 16 00:34:37 crc kubenswrapper[4816]: I0316 00:34:37.398363 4816 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0b78944c-c894-4d7f-bbe3-82eee916db70-bundle\") on node \"crc\" DevicePath \"\"" Mar 16 00:34:37 crc kubenswrapper[4816]: I0316 00:34:37.402106 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78944c-c894-4d7f-bbe3-82eee916db70-kube-api-access-qkzgc" (OuterVolumeSpecName: "kube-api-access-qkzgc") pod "0b78944c-c894-4d7f-bbe3-82eee916db70" (UID: "0b78944c-c894-4d7f-bbe3-82eee916db70"). InnerVolumeSpecName "kube-api-access-qkzgc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:34:37 crc kubenswrapper[4816]: I0316 00:34:37.413900 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0b78944c-c894-4d7f-bbe3-82eee916db70-util" (OuterVolumeSpecName: "util") pod "0b78944c-c894-4d7f-bbe3-82eee916db70" (UID: "0b78944c-c894-4d7f-bbe3-82eee916db70"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:34:37 crc kubenswrapper[4816]: I0316 00:34:37.499080 4816 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0b78944c-c894-4d7f-bbe3-82eee916db70-util\") on node \"crc\" DevicePath \"\"" Mar 16 00:34:37 crc kubenswrapper[4816]: I0316 00:34:37.499121 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qkzgc\" (UniqueName: \"kubernetes.io/projected/0b78944c-c894-4d7f-bbe3-82eee916db70-kube-api-access-qkzgc\") on node \"crc\" DevicePath \"\"" Mar 16 00:34:37 crc kubenswrapper[4816]: I0316 00:34:37.934538 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a9tbth" Mar 16 00:34:37 crc kubenswrapper[4816]: I0316 00:34:37.934770 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a9tbth" event={"ID":"0b78944c-c894-4d7f-bbe3-82eee916db70","Type":"ContainerDied","Data":"d790c89c5f398f362a0ba954ed74496a86b4d207075a97a2dba1f9ba8fe354b1"} Mar 16 00:34:37 crc kubenswrapper[4816]: I0316 00:34:37.934824 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d790c89c5f398f362a0ba954ed74496a86b4d207075a97a2dba1f9ba8fe354b1" Mar 16 00:34:38 crc kubenswrapper[4816]: I0316 00:34:38.234317 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09j6cjz" Mar 16 00:34:38 crc kubenswrapper[4816]: I0316 00:34:38.306898 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f6754fbe-ac20-4fe6-8c87-6d30f20069b9-util\") pod \"f6754fbe-ac20-4fe6-8c87-6d30f20069b9\" (UID: \"f6754fbe-ac20-4fe6-8c87-6d30f20069b9\") " Mar 16 00:34:38 crc kubenswrapper[4816]: I0316 00:34:38.306958 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pf99h\" (UniqueName: \"kubernetes.io/projected/f6754fbe-ac20-4fe6-8c87-6d30f20069b9-kube-api-access-pf99h\") pod \"f6754fbe-ac20-4fe6-8c87-6d30f20069b9\" (UID: \"f6754fbe-ac20-4fe6-8c87-6d30f20069b9\") " Mar 16 00:34:38 crc kubenswrapper[4816]: I0316 00:34:38.306994 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f6754fbe-ac20-4fe6-8c87-6d30f20069b9-bundle\") pod \"f6754fbe-ac20-4fe6-8c87-6d30f20069b9\" (UID: \"f6754fbe-ac20-4fe6-8c87-6d30f20069b9\") " Mar 16 00:34:38 crc kubenswrapper[4816]: I0316 00:34:38.307646 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6754fbe-ac20-4fe6-8c87-6d30f20069b9-bundle" (OuterVolumeSpecName: "bundle") pod "f6754fbe-ac20-4fe6-8c87-6d30f20069b9" (UID: "f6754fbe-ac20-4fe6-8c87-6d30f20069b9"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:34:38 crc kubenswrapper[4816]: I0316 00:34:38.316935 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6754fbe-ac20-4fe6-8c87-6d30f20069b9-kube-api-access-pf99h" (OuterVolumeSpecName: "kube-api-access-pf99h") pod "f6754fbe-ac20-4fe6-8c87-6d30f20069b9" (UID: "f6754fbe-ac20-4fe6-8c87-6d30f20069b9"). InnerVolumeSpecName "kube-api-access-pf99h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:34:38 crc kubenswrapper[4816]: I0316 00:34:38.321429 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6754fbe-ac20-4fe6-8c87-6d30f20069b9-util" (OuterVolumeSpecName: "util") pod "f6754fbe-ac20-4fe6-8c87-6d30f20069b9" (UID: "f6754fbe-ac20-4fe6-8c87-6d30f20069b9"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:34:38 crc kubenswrapper[4816]: I0316 00:34:38.408027 4816 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f6754fbe-ac20-4fe6-8c87-6d30f20069b9-util\") on node \"crc\" DevicePath \"\"" Mar 16 00:34:38 crc kubenswrapper[4816]: I0316 00:34:38.408062 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pf99h\" (UniqueName: \"kubernetes.io/projected/f6754fbe-ac20-4fe6-8c87-6d30f20069b9-kube-api-access-pf99h\") on node \"crc\" DevicePath \"\"" Mar 16 00:34:38 crc kubenswrapper[4816]: I0316 00:34:38.408074 4816 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f6754fbe-ac20-4fe6-8c87-6d30f20069b9-bundle\") on node \"crc\" DevicePath \"\"" Mar 16 00:34:38 crc kubenswrapper[4816]: I0316 00:34:38.944530 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09j6cjz" event={"ID":"f6754fbe-ac20-4fe6-8c87-6d30f20069b9","Type":"ContainerDied","Data":"add5543c2b87b62fcb6cc941f4e9d34042291c66893841625339e4cb436c7477"} Mar 16 00:34:38 crc kubenswrapper[4816]: I0316 00:34:38.944901 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="add5543c2b87b62fcb6cc941f4e9d34042291c66893841625339e4cb436c7477" Mar 16 00:34:38 crc kubenswrapper[4816]: I0316 00:34:38.944656 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09j6cjz" Mar 16 00:34:44 crc kubenswrapper[4816]: I0316 00:34:44.747030 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/smart-gateway-operator-55c8479bdf-6m44w"] Mar 16 00:34:44 crc kubenswrapper[4816]: E0316 00:34:44.747577 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b78944c-c894-4d7f-bbe3-82eee916db70" containerName="util" Mar 16 00:34:44 crc kubenswrapper[4816]: I0316 00:34:44.747595 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b78944c-c894-4d7f-bbe3-82eee916db70" containerName="util" Mar 16 00:34:44 crc kubenswrapper[4816]: E0316 00:34:44.747607 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b78944c-c894-4d7f-bbe3-82eee916db70" containerName="pull" Mar 16 00:34:44 crc kubenswrapper[4816]: I0316 00:34:44.747615 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b78944c-c894-4d7f-bbe3-82eee916db70" containerName="pull" Mar 16 00:34:44 crc kubenswrapper[4816]: E0316 00:34:44.747628 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b78944c-c894-4d7f-bbe3-82eee916db70" containerName="extract" Mar 16 00:34:44 crc kubenswrapper[4816]: I0316 00:34:44.747635 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b78944c-c894-4d7f-bbe3-82eee916db70" containerName="extract" Mar 16 00:34:44 crc kubenswrapper[4816]: E0316 00:34:44.747651 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6754fbe-ac20-4fe6-8c87-6d30f20069b9" containerName="util" Mar 16 00:34:44 crc kubenswrapper[4816]: I0316 00:34:44.747657 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6754fbe-ac20-4fe6-8c87-6d30f20069b9" containerName="util" Mar 16 00:34:44 crc kubenswrapper[4816]: E0316 00:34:44.747672 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6754fbe-ac20-4fe6-8c87-6d30f20069b9" containerName="pull" Mar 16 
00:34:44 crc kubenswrapper[4816]: I0316 00:34:44.747678 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6754fbe-ac20-4fe6-8c87-6d30f20069b9" containerName="pull" Mar 16 00:34:44 crc kubenswrapper[4816]: E0316 00:34:44.747686 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6754fbe-ac20-4fe6-8c87-6d30f20069b9" containerName="extract" Mar 16 00:34:44 crc kubenswrapper[4816]: I0316 00:34:44.747695 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6754fbe-ac20-4fe6-8c87-6d30f20069b9" containerName="extract" Mar 16 00:34:44 crc kubenswrapper[4816]: I0316 00:34:44.747813 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b78944c-c894-4d7f-bbe3-82eee916db70" containerName="extract" Mar 16 00:34:44 crc kubenswrapper[4816]: I0316 00:34:44.747828 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6754fbe-ac20-4fe6-8c87-6d30f20069b9" containerName="extract" Mar 16 00:34:44 crc kubenswrapper[4816]: I0316 00:34:44.748317 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/smart-gateway-operator-55c8479bdf-6m44w" Mar 16 00:34:44 crc kubenswrapper[4816]: I0316 00:34:44.750573 4816 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"smart-gateway-operator-dockercfg-95tfc" Mar 16 00:34:44 crc kubenswrapper[4816]: I0316 00:34:44.761725 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-55c8479bdf-6m44w"] Mar 16 00:34:44 crc kubenswrapper[4816]: I0316 00:34:44.792442 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"runner\" (UniqueName: \"kubernetes.io/empty-dir/e96079bc-73ba-420e-9568-cea10077c4ae-runner\") pod \"smart-gateway-operator-55c8479bdf-6m44w\" (UID: \"e96079bc-73ba-420e-9568-cea10077c4ae\") " pod="service-telemetry/smart-gateway-operator-55c8479bdf-6m44w" Mar 16 00:34:44 crc kubenswrapper[4816]: I0316 00:34:44.792485 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wsdw4\" (UniqueName: \"kubernetes.io/projected/e96079bc-73ba-420e-9568-cea10077c4ae-kube-api-access-wsdw4\") pod \"smart-gateway-operator-55c8479bdf-6m44w\" (UID: \"e96079bc-73ba-420e-9568-cea10077c4ae\") " pod="service-telemetry/smart-gateway-operator-55c8479bdf-6m44w" Mar 16 00:34:44 crc kubenswrapper[4816]: I0316 00:34:44.894139 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"runner\" (UniqueName: \"kubernetes.io/empty-dir/e96079bc-73ba-420e-9568-cea10077c4ae-runner\") pod \"smart-gateway-operator-55c8479bdf-6m44w\" (UID: \"e96079bc-73ba-420e-9568-cea10077c4ae\") " pod="service-telemetry/smart-gateway-operator-55c8479bdf-6m44w" Mar 16 00:34:44 crc kubenswrapper[4816]: I0316 00:34:44.894207 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wsdw4\" (UniqueName: 
\"kubernetes.io/projected/e96079bc-73ba-420e-9568-cea10077c4ae-kube-api-access-wsdw4\") pod \"smart-gateway-operator-55c8479bdf-6m44w\" (UID: \"e96079bc-73ba-420e-9568-cea10077c4ae\") " pod="service-telemetry/smart-gateway-operator-55c8479bdf-6m44w" Mar 16 00:34:44 crc kubenswrapper[4816]: I0316 00:34:44.894715 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"runner\" (UniqueName: \"kubernetes.io/empty-dir/e96079bc-73ba-420e-9568-cea10077c4ae-runner\") pod \"smart-gateway-operator-55c8479bdf-6m44w\" (UID: \"e96079bc-73ba-420e-9568-cea10077c4ae\") " pod="service-telemetry/smart-gateway-operator-55c8479bdf-6m44w" Mar 16 00:34:44 crc kubenswrapper[4816]: I0316 00:34:44.919880 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wsdw4\" (UniqueName: \"kubernetes.io/projected/e96079bc-73ba-420e-9568-cea10077c4ae-kube-api-access-wsdw4\") pod \"smart-gateway-operator-55c8479bdf-6m44w\" (UID: \"e96079bc-73ba-420e-9568-cea10077c4ae\") " pod="service-telemetry/smart-gateway-operator-55c8479bdf-6m44w" Mar 16 00:34:45 crc kubenswrapper[4816]: I0316 00:34:45.071721 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/smart-gateway-operator-55c8479bdf-6m44w" Mar 16 00:34:45 crc kubenswrapper[4816]: I0316 00:34:45.561187 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-55c8479bdf-6m44w"] Mar 16 00:34:46 crc kubenswrapper[4816]: I0316 00:34:46.002542 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-55c8479bdf-6m44w" event={"ID":"e96079bc-73ba-420e-9568-cea10077c4ae","Type":"ContainerStarted","Data":"ae492dbdf77984277c138e17866872cb0ae074398da44666fcd372b9ab93eb20"} Mar 16 00:34:48 crc kubenswrapper[4816]: I0316 00:34:48.293767 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/service-telemetry-operator-7dbcddcc6f-lqrgz"] Mar 16 00:34:48 crc kubenswrapper[4816]: I0316 00:34:48.295827 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-7dbcddcc6f-lqrgz" Mar 16 00:34:48 crc kubenswrapper[4816]: I0316 00:34:48.301433 4816 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"service-telemetry-operator-dockercfg-kx5vr" Mar 16 00:34:48 crc kubenswrapper[4816]: I0316 00:34:48.315323 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-7dbcddcc6f-lqrgz"] Mar 16 00:34:48 crc kubenswrapper[4816]: I0316 00:34:48.354960 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"runner\" (UniqueName: \"kubernetes.io/empty-dir/0faefde0-6740-414f-bf47-0d763a35b22f-runner\") pod \"service-telemetry-operator-7dbcddcc6f-lqrgz\" (UID: \"0faefde0-6740-414f-bf47-0d763a35b22f\") " pod="service-telemetry/service-telemetry-operator-7dbcddcc6f-lqrgz" Mar 16 00:34:48 crc kubenswrapper[4816]: I0316 00:34:48.355017 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b79gs\" 
(UniqueName: \"kubernetes.io/projected/0faefde0-6740-414f-bf47-0d763a35b22f-kube-api-access-b79gs\") pod \"service-telemetry-operator-7dbcddcc6f-lqrgz\" (UID: \"0faefde0-6740-414f-bf47-0d763a35b22f\") " pod="service-telemetry/service-telemetry-operator-7dbcddcc6f-lqrgz" Mar 16 00:34:48 crc kubenswrapper[4816]: I0316 00:34:48.455833 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"runner\" (UniqueName: \"kubernetes.io/empty-dir/0faefde0-6740-414f-bf47-0d763a35b22f-runner\") pod \"service-telemetry-operator-7dbcddcc6f-lqrgz\" (UID: \"0faefde0-6740-414f-bf47-0d763a35b22f\") " pod="service-telemetry/service-telemetry-operator-7dbcddcc6f-lqrgz" Mar 16 00:34:48 crc kubenswrapper[4816]: I0316 00:34:48.455911 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b79gs\" (UniqueName: \"kubernetes.io/projected/0faefde0-6740-414f-bf47-0d763a35b22f-kube-api-access-b79gs\") pod \"service-telemetry-operator-7dbcddcc6f-lqrgz\" (UID: \"0faefde0-6740-414f-bf47-0d763a35b22f\") " pod="service-telemetry/service-telemetry-operator-7dbcddcc6f-lqrgz" Mar 16 00:34:48 crc kubenswrapper[4816]: I0316 00:34:48.456347 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"runner\" (UniqueName: \"kubernetes.io/empty-dir/0faefde0-6740-414f-bf47-0d763a35b22f-runner\") pod \"service-telemetry-operator-7dbcddcc6f-lqrgz\" (UID: \"0faefde0-6740-414f-bf47-0d763a35b22f\") " pod="service-telemetry/service-telemetry-operator-7dbcddcc6f-lqrgz" Mar 16 00:34:48 crc kubenswrapper[4816]: I0316 00:34:48.477117 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b79gs\" (UniqueName: \"kubernetes.io/projected/0faefde0-6740-414f-bf47-0d763a35b22f-kube-api-access-b79gs\") pod \"service-telemetry-operator-7dbcddcc6f-lqrgz\" (UID: \"0faefde0-6740-414f-bf47-0d763a35b22f\") " pod="service-telemetry/service-telemetry-operator-7dbcddcc6f-lqrgz" Mar 16 00:34:48 crc kubenswrapper[4816]: 
I0316 00:34:48.622872 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-7dbcddcc6f-lqrgz" Mar 16 00:34:57 crc kubenswrapper[4816]: I0316 00:34:57.843958 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-7dbcddcc6f-lqrgz"] Mar 16 00:34:58 crc kubenswrapper[4816]: I0316 00:34:58.088015 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-7dbcddcc6f-lqrgz" event={"ID":"0faefde0-6740-414f-bf47-0d763a35b22f","Type":"ContainerStarted","Data":"7ae4409471dad3fe81d11dad7876cb9dadb1f3b34b935cca3cb522c0fab14ad9"} Mar 16 00:35:01 crc kubenswrapper[4816]: E0316 00:35:01.034611 4816 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/infrawatch/smart-gateway-operator:latest" Mar 16 00:35:01 crc kubenswrapper[4816]: E0316 00:35:01.035077 4816 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:operator,Image:quay.io/infrawatch/smart-gateway-operator:latest,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:WATCH_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.annotations['olm.targetNamespaces'],},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:OPERATOR_NAME,Value:smart-gateway-operator,ValueFrom:nil,},EnvVar{Name:ANSIBLE_GATHERING,Value:explicit,ValueFrom:nil,},EnvVar{Name:ANSIBLE_VERBOSITY_SMARTGATEWAY_SMARTGATEWAY_INFRA_WATCH,Value:4,ValueFrom:nil,},EnvVar{Name:ANSIBLE_DEBUG_LOGS,Value:true,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CORE_SMARTGATEWAY_IMAGE,Value:image-registry.openshift-image-registry.svc:5000/service-telemetry/sg-core:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BRIDGE_SMARTGATEWAY_IMAGE,Value:image-registry.openshift-image-registry.svc:5000/service-telemetry/sg-bridge:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OAUTH_PROXY_IMAGE,Value:quay.io/openshift/origin-oauth-proxy:latest,ValueFrom:nil,},EnvVar{Name:OPERATOR_CONDITION_NAME,Value:smart-gateway-operator.v5.0.1773621141,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:runner,ReadOnly:false,MountPath:/tmp/ansible-operator/runner,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wsdw4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop
:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000670000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod smart-gateway-operator-55c8479bdf-6m44w_service-telemetry(e96079bc-73ba-420e-9568-cea10077c4ae): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 16 00:35:01 crc kubenswrapper[4816]: E0316 00:35:01.036253 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="service-telemetry/smart-gateway-operator-55c8479bdf-6m44w" podUID="e96079bc-73ba-420e-9568-cea10077c4ae" Mar 16 00:35:01 crc kubenswrapper[4816]: E0316 00:35:01.105902 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/infrawatch/smart-gateway-operator:latest\\\"\"" pod="service-telemetry/smart-gateway-operator-55c8479bdf-6m44w" podUID="e96079bc-73ba-420e-9568-cea10077c4ae" Mar 16 00:35:04 crc kubenswrapper[4816]: I0316 00:35:04.528898 4816 scope.go:117] "RemoveContainer" containerID="2184a6c7d5ea889f0c49670caabfc30e2cdb52bf2b9beb7864557d83b84bbb54" Mar 16 00:35:06 crc kubenswrapper[4816]: I0316 00:35:06.139298 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-7dbcddcc6f-lqrgz" event={"ID":"0faefde0-6740-414f-bf47-0d763a35b22f","Type":"ContainerStarted","Data":"5613cc1e49e0a61f32d364bd129d564af21046d3694be5c16aee6856f727d100"} Mar 16 00:35:06 crc kubenswrapper[4816]: I0316 00:35:06.162835 4816 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/service-telemetry-operator-7dbcddcc6f-lqrgz" podStartSLOduration=10.647741651 podStartE2EDuration="18.162807307s" podCreationTimestamp="2026-03-16 00:34:48 +0000 UTC" firstStartedPulling="2026-03-16 00:34:57.888322692 +0000 UTC m=+1690.984622645" lastFinishedPulling="2026-03-16 00:35:05.403388308 +0000 UTC m=+1698.499688301" observedRunningTime="2026-03-16 00:35:06.16147027 +0000 UTC m=+1699.257770263" watchObservedRunningTime="2026-03-16 00:35:06.162807307 +0000 UTC m=+1699.259107300" Mar 16 00:35:17 crc kubenswrapper[4816]: I0316 00:35:17.228368 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-55c8479bdf-6m44w" event={"ID":"e96079bc-73ba-420e-9568-cea10077c4ae","Type":"ContainerStarted","Data":"9ca787bd1ff2bd1d864f8b0766658d09f0733ad33d64614af7275cf5f52a93a0"} Mar 16 00:35:17 crc kubenswrapper[4816]: I0316 00:35:17.247329 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/smart-gateway-operator-55c8479bdf-6m44w" podStartSLOduration=2.631006352 podStartE2EDuration="33.247305263s" podCreationTimestamp="2026-03-16 00:34:44 +0000 UTC" firstStartedPulling="2026-03-16 00:34:45.56848991 +0000 UTC m=+1678.664789863" lastFinishedPulling="2026-03-16 00:35:16.184788821 +0000 UTC m=+1709.281088774" observedRunningTime="2026-03-16 00:35:17.245241285 +0000 UTC m=+1710.341541238" watchObservedRunningTime="2026-03-16 00:35:17.247305263 +0000 UTC m=+1710.343605226" Mar 16 00:35:30 crc kubenswrapper[4816]: I0316 00:35:30.916878 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-zdlgx"] Mar 16 00:35:30 crc kubenswrapper[4816]: I0316 00:35:30.918087 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-interconnect-68864d46cb-zdlgx" Mar 16 00:35:30 crc kubenswrapper[4816]: I0316 00:35:30.921330 4816 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-inter-router-credentials" Mar 16 00:35:30 crc kubenswrapper[4816]: I0316 00:35:30.921417 4816 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-dockercfg-j42x2" Mar 16 00:35:30 crc kubenswrapper[4816]: I0316 00:35:30.921442 4816 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-openstack-ca" Mar 16 00:35:30 crc kubenswrapper[4816]: I0316 00:35:30.922033 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"default-interconnect-sasl-config" Mar 16 00:35:30 crc kubenswrapper[4816]: I0316 00:35:30.922036 4816 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-inter-router-ca" Mar 16 00:35:30 crc kubenswrapper[4816]: I0316 00:35:30.923548 4816 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-users" Mar 16 00:35:30 crc kubenswrapper[4816]: I0316 00:35:30.923953 4816 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-openstack-credentials" Mar 16 00:35:30 crc kubenswrapper[4816]: I0316 00:35:30.943394 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-zdlgx"] Mar 16 00:35:31 crc kubenswrapper[4816]: I0316 00:35:31.041094 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/91573536-f8d4-475f-bfb6-dd2ad9910ce0-default-interconnect-inter-router-credentials\") pod \"default-interconnect-68864d46cb-zdlgx\" (UID: 
\"91573536-f8d4-475f-bfb6-dd2ad9910ce0\") " pod="service-telemetry/default-interconnect-68864d46cb-zdlgx" Mar 16 00:35:31 crc kubenswrapper[4816]: I0316 00:35:31.041166 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/91573536-f8d4-475f-bfb6-dd2ad9910ce0-default-interconnect-openstack-ca\") pod \"default-interconnect-68864d46cb-zdlgx\" (UID: \"91573536-f8d4-475f-bfb6-dd2ad9910ce0\") " pod="service-telemetry/default-interconnect-68864d46cb-zdlgx" Mar 16 00:35:31 crc kubenswrapper[4816]: I0316 00:35:31.041199 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/91573536-f8d4-475f-bfb6-dd2ad9910ce0-default-interconnect-openstack-credentials\") pod \"default-interconnect-68864d46cb-zdlgx\" (UID: \"91573536-f8d4-475f-bfb6-dd2ad9910ce0\") " pod="service-telemetry/default-interconnect-68864d46cb-zdlgx" Mar 16 00:35:31 crc kubenswrapper[4816]: I0316 00:35:31.041216 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/91573536-f8d4-475f-bfb6-dd2ad9910ce0-sasl-config\") pod \"default-interconnect-68864d46cb-zdlgx\" (UID: \"91573536-f8d4-475f-bfb6-dd2ad9910ce0\") " pod="service-telemetry/default-interconnect-68864d46cb-zdlgx" Mar 16 00:35:31 crc kubenswrapper[4816]: I0316 00:35:31.041249 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/91573536-f8d4-475f-bfb6-dd2ad9910ce0-sasl-users\") pod \"default-interconnect-68864d46cb-zdlgx\" (UID: \"91573536-f8d4-475f-bfb6-dd2ad9910ce0\") " pod="service-telemetry/default-interconnect-68864d46cb-zdlgx" Mar 16 00:35:31 crc kubenswrapper[4816]: I0316 00:35:31.041391 4816 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/91573536-f8d4-475f-bfb6-dd2ad9910ce0-default-interconnect-inter-router-ca\") pod \"default-interconnect-68864d46cb-zdlgx\" (UID: \"91573536-f8d4-475f-bfb6-dd2ad9910ce0\") " pod="service-telemetry/default-interconnect-68864d46cb-zdlgx" Mar 16 00:35:31 crc kubenswrapper[4816]: I0316 00:35:31.041462 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mth5q\" (UniqueName: \"kubernetes.io/projected/91573536-f8d4-475f-bfb6-dd2ad9910ce0-kube-api-access-mth5q\") pod \"default-interconnect-68864d46cb-zdlgx\" (UID: \"91573536-f8d4-475f-bfb6-dd2ad9910ce0\") " pod="service-telemetry/default-interconnect-68864d46cb-zdlgx" Mar 16 00:35:31 crc kubenswrapper[4816]: I0316 00:35:31.144246 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mth5q\" (UniqueName: \"kubernetes.io/projected/91573536-f8d4-475f-bfb6-dd2ad9910ce0-kube-api-access-mth5q\") pod \"default-interconnect-68864d46cb-zdlgx\" (UID: \"91573536-f8d4-475f-bfb6-dd2ad9910ce0\") " pod="service-telemetry/default-interconnect-68864d46cb-zdlgx" Mar 16 00:35:31 crc kubenswrapper[4816]: I0316 00:35:31.144326 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/91573536-f8d4-475f-bfb6-dd2ad9910ce0-default-interconnect-inter-router-credentials\") pod \"default-interconnect-68864d46cb-zdlgx\" (UID: \"91573536-f8d4-475f-bfb6-dd2ad9910ce0\") " pod="service-telemetry/default-interconnect-68864d46cb-zdlgx" Mar 16 00:35:31 crc kubenswrapper[4816]: I0316 00:35:31.144357 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-openstack-ca\" (UniqueName: 
\"kubernetes.io/secret/91573536-f8d4-475f-bfb6-dd2ad9910ce0-default-interconnect-openstack-ca\") pod \"default-interconnect-68864d46cb-zdlgx\" (UID: \"91573536-f8d4-475f-bfb6-dd2ad9910ce0\") " pod="service-telemetry/default-interconnect-68864d46cb-zdlgx" Mar 16 00:35:31 crc kubenswrapper[4816]: I0316 00:35:31.144381 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/91573536-f8d4-475f-bfb6-dd2ad9910ce0-default-interconnect-openstack-credentials\") pod \"default-interconnect-68864d46cb-zdlgx\" (UID: \"91573536-f8d4-475f-bfb6-dd2ad9910ce0\") " pod="service-telemetry/default-interconnect-68864d46cb-zdlgx" Mar 16 00:35:31 crc kubenswrapper[4816]: I0316 00:35:31.144397 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/91573536-f8d4-475f-bfb6-dd2ad9910ce0-sasl-config\") pod \"default-interconnect-68864d46cb-zdlgx\" (UID: \"91573536-f8d4-475f-bfb6-dd2ad9910ce0\") " pod="service-telemetry/default-interconnect-68864d46cb-zdlgx" Mar 16 00:35:31 crc kubenswrapper[4816]: I0316 00:35:31.144436 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/91573536-f8d4-475f-bfb6-dd2ad9910ce0-sasl-users\") pod \"default-interconnect-68864d46cb-zdlgx\" (UID: \"91573536-f8d4-475f-bfb6-dd2ad9910ce0\") " pod="service-telemetry/default-interconnect-68864d46cb-zdlgx" Mar 16 00:35:31 crc kubenswrapper[4816]: I0316 00:35:31.144456 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/91573536-f8d4-475f-bfb6-dd2ad9910ce0-default-interconnect-inter-router-ca\") pod \"default-interconnect-68864d46cb-zdlgx\" (UID: \"91573536-f8d4-475f-bfb6-dd2ad9910ce0\") " pod="service-telemetry/default-interconnect-68864d46cb-zdlgx" Mar 16 
00:35:31 crc kubenswrapper[4816]: I0316 00:35:31.145801 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/91573536-f8d4-475f-bfb6-dd2ad9910ce0-sasl-config\") pod \"default-interconnect-68864d46cb-zdlgx\" (UID: \"91573536-f8d4-475f-bfb6-dd2ad9910ce0\") " pod="service-telemetry/default-interconnect-68864d46cb-zdlgx" Mar 16 00:35:31 crc kubenswrapper[4816]: I0316 00:35:31.153963 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/91573536-f8d4-475f-bfb6-dd2ad9910ce0-default-interconnect-inter-router-credentials\") pod \"default-interconnect-68864d46cb-zdlgx\" (UID: \"91573536-f8d4-475f-bfb6-dd2ad9910ce0\") " pod="service-telemetry/default-interconnect-68864d46cb-zdlgx" Mar 16 00:35:31 crc kubenswrapper[4816]: I0316 00:35:31.154195 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/91573536-f8d4-475f-bfb6-dd2ad9910ce0-default-interconnect-inter-router-ca\") pod \"default-interconnect-68864d46cb-zdlgx\" (UID: \"91573536-f8d4-475f-bfb6-dd2ad9910ce0\") " pod="service-telemetry/default-interconnect-68864d46cb-zdlgx" Mar 16 00:35:31 crc kubenswrapper[4816]: I0316 00:35:31.168142 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/91573536-f8d4-475f-bfb6-dd2ad9910ce0-default-interconnect-openstack-credentials\") pod \"default-interconnect-68864d46cb-zdlgx\" (UID: \"91573536-f8d4-475f-bfb6-dd2ad9910ce0\") " pod="service-telemetry/default-interconnect-68864d46cb-zdlgx" Mar 16 00:35:31 crc kubenswrapper[4816]: I0316 00:35:31.169249 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mth5q\" (UniqueName: 
\"kubernetes.io/projected/91573536-f8d4-475f-bfb6-dd2ad9910ce0-kube-api-access-mth5q\") pod \"default-interconnect-68864d46cb-zdlgx\" (UID: \"91573536-f8d4-475f-bfb6-dd2ad9910ce0\") " pod="service-telemetry/default-interconnect-68864d46cb-zdlgx" Mar 16 00:35:31 crc kubenswrapper[4816]: I0316 00:35:31.170360 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/91573536-f8d4-475f-bfb6-dd2ad9910ce0-sasl-users\") pod \"default-interconnect-68864d46cb-zdlgx\" (UID: \"91573536-f8d4-475f-bfb6-dd2ad9910ce0\") " pod="service-telemetry/default-interconnect-68864d46cb-zdlgx" Mar 16 00:35:31 crc kubenswrapper[4816]: I0316 00:35:31.184460 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/91573536-f8d4-475f-bfb6-dd2ad9910ce0-default-interconnect-openstack-ca\") pod \"default-interconnect-68864d46cb-zdlgx\" (UID: \"91573536-f8d4-475f-bfb6-dd2ad9910ce0\") " pod="service-telemetry/default-interconnect-68864d46cb-zdlgx" Mar 16 00:35:31 crc kubenswrapper[4816]: I0316 00:35:31.231895 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-interconnect-68864d46cb-zdlgx" Mar 16 00:35:31 crc kubenswrapper[4816]: I0316 00:35:31.633501 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-zdlgx"] Mar 16 00:35:31 crc kubenswrapper[4816]: W0316 00:35:31.638260 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod91573536_f8d4_475f_bfb6_dd2ad9910ce0.slice/crio-26dc3f4ee67e3496e2e4a3275b5368fe7536b3be135627eecf62f5d01c3d1d56 WatchSource:0}: Error finding container 26dc3f4ee67e3496e2e4a3275b5368fe7536b3be135627eecf62f5d01c3d1d56: Status 404 returned error can't find the container with id 26dc3f4ee67e3496e2e4a3275b5368fe7536b3be135627eecf62f5d01c3d1d56 Mar 16 00:35:31 crc kubenswrapper[4816]: I0316 00:35:31.863981 4816 patch_prober.go:28] interesting pod/machine-config-daemon-jrdcz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 16 00:35:31 crc kubenswrapper[4816]: I0316 00:35:31.864059 4816 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" podUID="dd08ece2-7636-4966-973a-e96a34b70b53" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 16 00:35:32 crc kubenswrapper[4816]: I0316 00:35:32.341795 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-interconnect-68864d46cb-zdlgx" event={"ID":"91573536-f8d4-475f-bfb6-dd2ad9910ce0","Type":"ContainerStarted","Data":"26dc3f4ee67e3496e2e4a3275b5368fe7536b3be135627eecf62f5d01c3d1d56"} Mar 16 00:35:38 crc kubenswrapper[4816]: I0316 00:35:38.388326 4816 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="service-telemetry/default-interconnect-68864d46cb-zdlgx" event={"ID":"91573536-f8d4-475f-bfb6-dd2ad9910ce0","Type":"ContainerStarted","Data":"b845ba99f7acb2f04acbd6a1b07af235c4d1d04bb6986788b5b33868b6ffa6d3"} Mar 16 00:35:41 crc kubenswrapper[4816]: I0316 00:35:41.078085 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-interconnect-68864d46cb-zdlgx" podStartSLOduration=5.421713619 podStartE2EDuration="11.078060946s" podCreationTimestamp="2026-03-16 00:35:30 +0000 UTC" firstStartedPulling="2026-03-16 00:35:31.639796232 +0000 UTC m=+1724.736096185" lastFinishedPulling="2026-03-16 00:35:37.296143519 +0000 UTC m=+1730.392443512" observedRunningTime="2026-03-16 00:35:38.434086869 +0000 UTC m=+1731.530386832" watchObservedRunningTime="2026-03-16 00:35:41.078060946 +0000 UTC m=+1734.174360899" Mar 16 00:35:41 crc kubenswrapper[4816]: I0316 00:35:41.083049 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/prometheus-default-0"] Mar 16 00:35:41 crc kubenswrapper[4816]: I0316 00:35:41.084457 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/prometheus-default-0" Mar 16 00:35:41 crc kubenswrapper[4816]: I0316 00:35:41.087622 4816 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"prometheus-default" Mar 16 00:35:41 crc kubenswrapper[4816]: I0316 00:35:41.087628 4816 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-session-secret" Mar 16 00:35:41 crc kubenswrapper[4816]: I0316 00:35:41.087629 4816 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"prometheus-default-web-config" Mar 16 00:35:41 crc kubenswrapper[4816]: I0316 00:35:41.087623 4816 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-prometheus-proxy-tls" Mar 16 00:35:41 crc kubenswrapper[4816]: I0316 00:35:41.087794 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-default-rulefiles-2" Mar 16 00:35:41 crc kubenswrapper[4816]: I0316 00:35:41.087661 4816 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"prometheus-default-tls-assets-0" Mar 16 00:35:41 crc kubenswrapper[4816]: I0316 00:35:41.087759 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-default-rulefiles-0" Mar 16 00:35:41 crc kubenswrapper[4816]: I0316 00:35:41.088148 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"serving-certs-ca-bundle" Mar 16 00:35:41 crc kubenswrapper[4816]: I0316 00:35:41.088261 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-default-rulefiles-1" Mar 16 00:35:41 crc kubenswrapper[4816]: I0316 00:35:41.088370 4816 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"prometheus-stf-dockercfg-hcjdg" Mar 16 00:35:41 crc kubenswrapper[4816]: I0316 00:35:41.094619 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["service-telemetry/prometheus-default-0"] Mar 16 00:35:41 crc kubenswrapper[4816]: I0316 00:35:41.184688 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/078376fd-a0f8-4157-8a07-23ce85695dc6-web-config\") pod \"prometheus-default-0\" (UID: \"078376fd-a0f8-4157-8a07-23ce85695dc6\") " pod="service-telemetry/prometheus-default-0" Mar 16 00:35:41 crc kubenswrapper[4816]: I0316 00:35:41.184720 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-default-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/078376fd-a0f8-4157-8a07-23ce85695dc6-prometheus-default-rulefiles-0\") pod \"prometheus-default-0\" (UID: \"078376fd-a0f8-4157-8a07-23ce85695dc6\") " pod="service-telemetry/prometheus-default-0" Mar 16 00:35:41 crc kubenswrapper[4816]: I0316 00:35:41.184780 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/078376fd-a0f8-4157-8a07-23ce85695dc6-config-out\") pod \"prometheus-default-0\" (UID: \"078376fd-a0f8-4157-8a07-23ce85695dc6\") " pod="service-telemetry/prometheus-default-0" Mar 16 00:35:41 crc kubenswrapper[4816]: I0316 00:35:41.184826 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9ldj\" (UniqueName: \"kubernetes.io/projected/078376fd-a0f8-4157-8a07-23ce85695dc6-kube-api-access-r9ldj\") pod \"prometheus-default-0\" (UID: \"078376fd-a0f8-4157-8a07-23ce85695dc6\") " pod="service-telemetry/prometheus-default-0" Mar 16 00:35:41 crc kubenswrapper[4816]: I0316 00:35:41.184844 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-default-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/078376fd-a0f8-4157-8a07-23ce85695dc6-prometheus-default-rulefiles-2\") pod 
\"prometheus-default-0\" (UID: \"078376fd-a0f8-4157-8a07-23ce85695dc6\") " pod="service-telemetry/prometheus-default-0" Mar 16 00:35:41 crc kubenswrapper[4816]: I0316 00:35:41.184858 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/078376fd-a0f8-4157-8a07-23ce85695dc6-config\") pod \"prometheus-default-0\" (UID: \"078376fd-a0f8-4157-8a07-23ce85695dc6\") " pod="service-telemetry/prometheus-default-0" Mar 16 00:35:41 crc kubenswrapper[4816]: I0316 00:35:41.184907 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-default-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/078376fd-a0f8-4157-8a07-23ce85695dc6-prometheus-default-rulefiles-1\") pod \"prometheus-default-0\" (UID: \"078376fd-a0f8-4157-8a07-23ce85695dc6\") " pod="service-telemetry/prometheus-default-0" Mar 16 00:35:41 crc kubenswrapper[4816]: I0316 00:35:41.184927 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-default-prometheus-proxy-tls\" (UniqueName: \"kubernetes.io/secret/078376fd-a0f8-4157-8a07-23ce85695dc6-secret-default-prometheus-proxy-tls\") pod \"prometheus-default-0\" (UID: \"078376fd-a0f8-4157-8a07-23ce85695dc6\") " pod="service-telemetry/prometheus-default-0" Mar 16 00:35:41 crc kubenswrapper[4816]: I0316 00:35:41.184991 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-b5feeac4-4007-401f-b65b-dde9f6400e27\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b5feeac4-4007-401f-b65b-dde9f6400e27\") pod \"prometheus-default-0\" (UID: \"078376fd-a0f8-4157-8a07-23ce85695dc6\") " pod="service-telemetry/prometheus-default-0" Mar 16 00:35:41 crc kubenswrapper[4816]: I0316 00:35:41.185040 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" 
(UniqueName: \"kubernetes.io/projected/078376fd-a0f8-4157-8a07-23ce85695dc6-tls-assets\") pod \"prometheus-default-0\" (UID: \"078376fd-a0f8-4157-8a07-23ce85695dc6\") " pod="service-telemetry/prometheus-default-0" Mar 16 00:35:41 crc kubenswrapper[4816]: I0316 00:35:41.185057 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/078376fd-a0f8-4157-8a07-23ce85695dc6-configmap-serving-certs-ca-bundle\") pod \"prometheus-default-0\" (UID: \"078376fd-a0f8-4157-8a07-23ce85695dc6\") " pod="service-telemetry/prometheus-default-0" Mar 16 00:35:41 crc kubenswrapper[4816]: I0316 00:35:41.185233 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-default-session-secret\" (UniqueName: \"kubernetes.io/secret/078376fd-a0f8-4157-8a07-23ce85695dc6-secret-default-session-secret\") pod \"prometheus-default-0\" (UID: \"078376fd-a0f8-4157-8a07-23ce85695dc6\") " pod="service-telemetry/prometheus-default-0" Mar 16 00:35:41 crc kubenswrapper[4816]: I0316 00:35:41.286257 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-session-secret\" (UniqueName: \"kubernetes.io/secret/078376fd-a0f8-4157-8a07-23ce85695dc6-secret-default-session-secret\") pod \"prometheus-default-0\" (UID: \"078376fd-a0f8-4157-8a07-23ce85695dc6\") " pod="service-telemetry/prometheus-default-0" Mar 16 00:35:41 crc kubenswrapper[4816]: I0316 00:35:41.286316 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/078376fd-a0f8-4157-8a07-23ce85695dc6-web-config\") pod \"prometheus-default-0\" (UID: \"078376fd-a0f8-4157-8a07-23ce85695dc6\") " pod="service-telemetry/prometheus-default-0" Mar 16 00:35:41 crc kubenswrapper[4816]: I0316 00:35:41.286333 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"prometheus-default-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/078376fd-a0f8-4157-8a07-23ce85695dc6-prometheus-default-rulefiles-0\") pod \"prometheus-default-0\" (UID: \"078376fd-a0f8-4157-8a07-23ce85695dc6\") " pod="service-telemetry/prometheus-default-0" Mar 16 00:35:41 crc kubenswrapper[4816]: I0316 00:35:41.286354 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/078376fd-a0f8-4157-8a07-23ce85695dc6-config-out\") pod \"prometheus-default-0\" (UID: \"078376fd-a0f8-4157-8a07-23ce85695dc6\") " pod="service-telemetry/prometheus-default-0" Mar 16 00:35:41 crc kubenswrapper[4816]: I0316 00:35:41.286382 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r9ldj\" (UniqueName: \"kubernetes.io/projected/078376fd-a0f8-4157-8a07-23ce85695dc6-kube-api-access-r9ldj\") pod \"prometheus-default-0\" (UID: \"078376fd-a0f8-4157-8a07-23ce85695dc6\") " pod="service-telemetry/prometheus-default-0" Mar 16 00:35:41 crc kubenswrapper[4816]: I0316 00:35:41.286403 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-default-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/078376fd-a0f8-4157-8a07-23ce85695dc6-prometheus-default-rulefiles-2\") pod \"prometheus-default-0\" (UID: \"078376fd-a0f8-4157-8a07-23ce85695dc6\") " pod="service-telemetry/prometheus-default-0" Mar 16 00:35:41 crc kubenswrapper[4816]: I0316 00:35:41.286420 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/078376fd-a0f8-4157-8a07-23ce85695dc6-config\") pod \"prometheus-default-0\" (UID: \"078376fd-a0f8-4157-8a07-23ce85695dc6\") " pod="service-telemetry/prometheus-default-0" Mar 16 00:35:41 crc kubenswrapper[4816]: I0316 00:35:41.286443 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-default-rulefiles-1\" 
(UniqueName: \"kubernetes.io/configmap/078376fd-a0f8-4157-8a07-23ce85695dc6-prometheus-default-rulefiles-1\") pod \"prometheus-default-0\" (UID: \"078376fd-a0f8-4157-8a07-23ce85695dc6\") " pod="service-telemetry/prometheus-default-0" Mar 16 00:35:41 crc kubenswrapper[4816]: I0316 00:35:41.286464 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-prometheus-proxy-tls\" (UniqueName: \"kubernetes.io/secret/078376fd-a0f8-4157-8a07-23ce85695dc6-secret-default-prometheus-proxy-tls\") pod \"prometheus-default-0\" (UID: \"078376fd-a0f8-4157-8a07-23ce85695dc6\") " pod="service-telemetry/prometheus-default-0" Mar 16 00:35:41 crc kubenswrapper[4816]: I0316 00:35:41.286493 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-b5feeac4-4007-401f-b65b-dde9f6400e27\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b5feeac4-4007-401f-b65b-dde9f6400e27\") pod \"prometheus-default-0\" (UID: \"078376fd-a0f8-4157-8a07-23ce85695dc6\") " pod="service-telemetry/prometheus-default-0" Mar 16 00:35:41 crc kubenswrapper[4816]: I0316 00:35:41.286520 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/078376fd-a0f8-4157-8a07-23ce85695dc6-tls-assets\") pod \"prometheus-default-0\" (UID: \"078376fd-a0f8-4157-8a07-23ce85695dc6\") " pod="service-telemetry/prometheus-default-0" Mar 16 00:35:41 crc kubenswrapper[4816]: I0316 00:35:41.286538 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/078376fd-a0f8-4157-8a07-23ce85695dc6-configmap-serving-certs-ca-bundle\") pod \"prometheus-default-0\" (UID: \"078376fd-a0f8-4157-8a07-23ce85695dc6\") " pod="service-telemetry/prometheus-default-0" Mar 16 00:35:41 crc kubenswrapper[4816]: E0316 00:35:41.286744 4816 secret.go:188] Couldn't get secret 
service-telemetry/default-prometheus-proxy-tls: secret "default-prometheus-proxy-tls" not found Mar 16 00:35:41 crc kubenswrapper[4816]: E0316 00:35:41.286815 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/078376fd-a0f8-4157-8a07-23ce85695dc6-secret-default-prometheus-proxy-tls podName:078376fd-a0f8-4157-8a07-23ce85695dc6 nodeName:}" failed. No retries permitted until 2026-03-16 00:35:41.786794936 +0000 UTC m=+1734.883094889 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "secret-default-prometheus-proxy-tls" (UniqueName: "kubernetes.io/secret/078376fd-a0f8-4157-8a07-23ce85695dc6-secret-default-prometheus-proxy-tls") pod "prometheus-default-0" (UID: "078376fd-a0f8-4157-8a07-23ce85695dc6") : secret "default-prometheus-proxy-tls" not found Mar 16 00:35:41 crc kubenswrapper[4816]: I0316 00:35:41.287285 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-default-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/078376fd-a0f8-4157-8a07-23ce85695dc6-prometheus-default-rulefiles-0\") pod \"prometheus-default-0\" (UID: \"078376fd-a0f8-4157-8a07-23ce85695dc6\") " pod="service-telemetry/prometheus-default-0" Mar 16 00:35:41 crc kubenswrapper[4816]: I0316 00:35:41.287454 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-default-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/078376fd-a0f8-4157-8a07-23ce85695dc6-prometheus-default-rulefiles-1\") pod \"prometheus-default-0\" (UID: \"078376fd-a0f8-4157-8a07-23ce85695dc6\") " pod="service-telemetry/prometheus-default-0" Mar 16 00:35:41 crc kubenswrapper[4816]: I0316 00:35:41.287515 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/078376fd-a0f8-4157-8a07-23ce85695dc6-configmap-serving-certs-ca-bundle\") pod \"prometheus-default-0\" (UID: \"078376fd-a0f8-4157-8a07-23ce85695dc6\") " 
pod="service-telemetry/prometheus-default-0" Mar 16 00:35:41 crc kubenswrapper[4816]: I0316 00:35:41.288127 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-default-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/078376fd-a0f8-4157-8a07-23ce85695dc6-prometheus-default-rulefiles-2\") pod \"prometheus-default-0\" (UID: \"078376fd-a0f8-4157-8a07-23ce85695dc6\") " pod="service-telemetry/prometheus-default-0" Mar 16 00:35:41 crc kubenswrapper[4816]: I0316 00:35:41.289819 4816 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 16 00:35:41 crc kubenswrapper[4816]: I0316 00:35:41.289844 4816 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-b5feeac4-4007-401f-b65b-dde9f6400e27\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b5feeac4-4007-401f-b65b-dde9f6400e27\") pod \"prometheus-default-0\" (UID: \"078376fd-a0f8-4157-8a07-23ce85695dc6\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/9f03051eddd5b0320ee2397f728e71018fe21f91cbc5c5dd1c3d97248c518ba7/globalmount\"" pod="service-telemetry/prometheus-default-0" Mar 16 00:35:41 crc kubenswrapper[4816]: I0316 00:35:41.292143 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/078376fd-a0f8-4157-8a07-23ce85695dc6-tls-assets\") pod \"prometheus-default-0\" (UID: \"078376fd-a0f8-4157-8a07-23ce85695dc6\") " pod="service-telemetry/prometheus-default-0" Mar 16 00:35:41 crc kubenswrapper[4816]: I0316 00:35:41.292160 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-default-session-secret\" (UniqueName: \"kubernetes.io/secret/078376fd-a0f8-4157-8a07-23ce85695dc6-secret-default-session-secret\") pod \"prometheus-default-0\" (UID: \"078376fd-a0f8-4157-8a07-23ce85695dc6\") " 
pod="service-telemetry/prometheus-default-0" Mar 16 00:35:41 crc kubenswrapper[4816]: I0316 00:35:41.293657 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/078376fd-a0f8-4157-8a07-23ce85695dc6-config-out\") pod \"prometheus-default-0\" (UID: \"078376fd-a0f8-4157-8a07-23ce85695dc6\") " pod="service-telemetry/prometheus-default-0" Mar 16 00:35:41 crc kubenswrapper[4816]: I0316 00:35:41.297987 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/078376fd-a0f8-4157-8a07-23ce85695dc6-config\") pod \"prometheus-default-0\" (UID: \"078376fd-a0f8-4157-8a07-23ce85695dc6\") " pod="service-telemetry/prometheus-default-0" Mar 16 00:35:41 crc kubenswrapper[4816]: I0316 00:35:41.305296 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9ldj\" (UniqueName: \"kubernetes.io/projected/078376fd-a0f8-4157-8a07-23ce85695dc6-kube-api-access-r9ldj\") pod \"prometheus-default-0\" (UID: \"078376fd-a0f8-4157-8a07-23ce85695dc6\") " pod="service-telemetry/prometheus-default-0" Mar 16 00:35:41 crc kubenswrapper[4816]: I0316 00:35:41.310976 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-b5feeac4-4007-401f-b65b-dde9f6400e27\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b5feeac4-4007-401f-b65b-dde9f6400e27\") pod \"prometheus-default-0\" (UID: \"078376fd-a0f8-4157-8a07-23ce85695dc6\") " pod="service-telemetry/prometheus-default-0" Mar 16 00:35:41 crc kubenswrapper[4816]: I0316 00:35:41.312091 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/078376fd-a0f8-4157-8a07-23ce85695dc6-web-config\") pod \"prometheus-default-0\" (UID: \"078376fd-a0f8-4157-8a07-23ce85695dc6\") " pod="service-telemetry/prometheus-default-0" Mar 16 00:35:41 crc kubenswrapper[4816]: I0316 00:35:41.792535 4816 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-prometheus-proxy-tls\" (UniqueName: \"kubernetes.io/secret/078376fd-a0f8-4157-8a07-23ce85695dc6-secret-default-prometheus-proxy-tls\") pod \"prometheus-default-0\" (UID: \"078376fd-a0f8-4157-8a07-23ce85695dc6\") " pod="service-telemetry/prometheus-default-0"
Mar 16 00:35:41 crc kubenswrapper[4816]: E0316 00:35:41.792763 4816 secret.go:188] Couldn't get secret service-telemetry/default-prometheus-proxy-tls: secret "default-prometheus-proxy-tls" not found
Mar 16 00:35:41 crc kubenswrapper[4816]: E0316 00:35:41.792936 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/078376fd-a0f8-4157-8a07-23ce85695dc6-secret-default-prometheus-proxy-tls podName:078376fd-a0f8-4157-8a07-23ce85695dc6 nodeName:}" failed. No retries permitted until 2026-03-16 00:35:42.792911607 +0000 UTC m=+1735.889211590 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "secret-default-prometheus-proxy-tls" (UniqueName: "kubernetes.io/secret/078376fd-a0f8-4157-8a07-23ce85695dc6-secret-default-prometheus-proxy-tls") pod "prometheus-default-0" (UID: "078376fd-a0f8-4157-8a07-23ce85695dc6") : secret "default-prometheus-proxy-tls" not found
Mar 16 00:35:42 crc kubenswrapper[4816]: I0316 00:35:42.804804 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-prometheus-proxy-tls\" (UniqueName: \"kubernetes.io/secret/078376fd-a0f8-4157-8a07-23ce85695dc6-secret-default-prometheus-proxy-tls\") pod \"prometheus-default-0\" (UID: \"078376fd-a0f8-4157-8a07-23ce85695dc6\") " pod="service-telemetry/prometheus-default-0"
Mar 16 00:35:42 crc kubenswrapper[4816]: I0316 00:35:42.814454 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-default-prometheus-proxy-tls\" (UniqueName: \"kubernetes.io/secret/078376fd-a0f8-4157-8a07-23ce85695dc6-secret-default-prometheus-proxy-tls\") pod \"prometheus-default-0\"
(UID: \"078376fd-a0f8-4157-8a07-23ce85695dc6\") " pod="service-telemetry/prometheus-default-0" Mar 16 00:35:42 crc kubenswrapper[4816]: I0316 00:35:42.901050 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/prometheus-default-0" Mar 16 00:35:43 crc kubenswrapper[4816]: I0316 00:35:43.184809 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/prometheus-default-0"] Mar 16 00:35:43 crc kubenswrapper[4816]: I0316 00:35:43.423898 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-default-0" event={"ID":"078376fd-a0f8-4157-8a07-23ce85695dc6","Type":"ContainerStarted","Data":"6bf29a457db28cca4c02bce3433651e24d35d93b2613dab3606cc5069dc54224"} Mar 16 00:35:48 crc kubenswrapper[4816]: I0316 00:35:48.480163 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-default-0" event={"ID":"078376fd-a0f8-4157-8a07-23ce85695dc6","Type":"ContainerStarted","Data":"ad5d6ee1ddaedfce29f1a42c7783da7e5b45d684fe8d750e463ea53229b88e8e"} Mar 16 00:35:51 crc kubenswrapper[4816]: I0316 00:35:51.086623 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-snmp-webhook-6856cfb745-2n6hc"] Mar 16 00:35:51 crc kubenswrapper[4816]: I0316 00:35:51.087799 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-snmp-webhook-6856cfb745-2n6hc" Mar 16 00:35:51 crc kubenswrapper[4816]: I0316 00:35:51.119763 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rlhrl\" (UniqueName: \"kubernetes.io/projected/5be5ca83-5116-48dc-8d6c-733cbd3e9682-kube-api-access-rlhrl\") pod \"default-snmp-webhook-6856cfb745-2n6hc\" (UID: \"5be5ca83-5116-48dc-8d6c-733cbd3e9682\") " pod="service-telemetry/default-snmp-webhook-6856cfb745-2n6hc" Mar 16 00:35:51 crc kubenswrapper[4816]: I0316 00:35:51.138825 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-snmp-webhook-6856cfb745-2n6hc"] Mar 16 00:35:51 crc kubenswrapper[4816]: I0316 00:35:51.220458 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rlhrl\" (UniqueName: \"kubernetes.io/projected/5be5ca83-5116-48dc-8d6c-733cbd3e9682-kube-api-access-rlhrl\") pod \"default-snmp-webhook-6856cfb745-2n6hc\" (UID: \"5be5ca83-5116-48dc-8d6c-733cbd3e9682\") " pod="service-telemetry/default-snmp-webhook-6856cfb745-2n6hc" Mar 16 00:35:51 crc kubenswrapper[4816]: I0316 00:35:51.239775 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rlhrl\" (UniqueName: \"kubernetes.io/projected/5be5ca83-5116-48dc-8d6c-733cbd3e9682-kube-api-access-rlhrl\") pod \"default-snmp-webhook-6856cfb745-2n6hc\" (UID: \"5be5ca83-5116-48dc-8d6c-733cbd3e9682\") " pod="service-telemetry/default-snmp-webhook-6856cfb745-2n6hc" Mar 16 00:35:51 crc kubenswrapper[4816]: I0316 00:35:51.404522 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-snmp-webhook-6856cfb745-2n6hc" Mar 16 00:35:51 crc kubenswrapper[4816]: I0316 00:35:51.856675 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-snmp-webhook-6856cfb745-2n6hc"] Mar 16 00:35:52 crc kubenswrapper[4816]: I0316 00:35:52.508922 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-snmp-webhook-6856cfb745-2n6hc" event={"ID":"5be5ca83-5116-48dc-8d6c-733cbd3e9682","Type":"ContainerStarted","Data":"9a41732642c14a0983b9e80b2ccd79de5ab039325162e3f4d598cc4a04912d7f"} Mar 16 00:35:54 crc kubenswrapper[4816]: I0316 00:35:54.547582 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/alertmanager-default-0"] Mar 16 00:35:54 crc kubenswrapper[4816]: I0316 00:35:54.550836 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/alertmanager-default-0" Mar 16 00:35:54 crc kubenswrapper[4816]: I0316 00:35:54.553354 4816 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"alertmanager-default-web-config" Mar 16 00:35:54 crc kubenswrapper[4816]: I0316 00:35:54.553573 4816 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"alertmanager-default-tls-assets-0" Mar 16 00:35:54 crc kubenswrapper[4816]: I0316 00:35:54.554017 4816 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"alertmanager-default-cluster-tls-config" Mar 16 00:35:54 crc kubenswrapper[4816]: I0316 00:35:54.554148 4816 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"alertmanager-default-generated" Mar 16 00:35:54 crc kubenswrapper[4816]: I0316 00:35:54.554279 4816 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-alertmanager-proxy-tls" Mar 16 00:35:54 crc kubenswrapper[4816]: I0316 00:35:54.561189 4816 reflector.go:368] Caches populated for *v1.Secret from 
object-"service-telemetry"/"alertmanager-stf-dockercfg-tg86t" Mar 16 00:35:54 crc kubenswrapper[4816]: I0316 00:35:54.571848 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/alertmanager-default-0"] Mar 16 00:35:54 crc kubenswrapper[4816]: I0316 00:35:54.674244 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-82d58573-88f1-4f19-b2e3-271c3d59b363\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-82d58573-88f1-4f19-b2e3-271c3d59b363\") pod \"alertmanager-default-0\" (UID: \"f4698c34-c93e-4d6f-8ab8-2bfcf3118410\") " pod="service-telemetry/alertmanager-default-0" Mar 16 00:35:54 crc kubenswrapper[4816]: I0316 00:35:54.674283 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f4698c34-c93e-4d6f-8ab8-2bfcf3118410-config-out\") pod \"alertmanager-default-0\" (UID: \"f4698c34-c93e-4d6f-8ab8-2bfcf3118410\") " pod="service-telemetry/alertmanager-default-0" Mar 16 00:35:54 crc kubenswrapper[4816]: I0316 00:35:54.674343 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f4698c34-c93e-4d6f-8ab8-2bfcf3118410-web-config\") pod \"alertmanager-default-0\" (UID: \"f4698c34-c93e-4d6f-8ab8-2bfcf3118410\") " pod="service-telemetry/alertmanager-default-0" Mar 16 00:35:54 crc kubenswrapper[4816]: I0316 00:35:54.674363 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/f4698c34-c93e-4d6f-8ab8-2bfcf3118410-config-volume\") pod \"alertmanager-default-0\" (UID: \"f4698c34-c93e-4d6f-8ab8-2bfcf3118410\") " pod="service-telemetry/alertmanager-default-0" Mar 16 00:35:54 crc kubenswrapper[4816]: I0316 00:35:54.674380 4816 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/f4698c34-c93e-4d6f-8ab8-2bfcf3118410-cluster-tls-config\") pod \"alertmanager-default-0\" (UID: \"f4698c34-c93e-4d6f-8ab8-2bfcf3118410\") " pod="service-telemetry/alertmanager-default-0" Mar 16 00:35:54 crc kubenswrapper[4816]: I0316 00:35:54.674432 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j75xg\" (UniqueName: \"kubernetes.io/projected/f4698c34-c93e-4d6f-8ab8-2bfcf3118410-kube-api-access-j75xg\") pod \"alertmanager-default-0\" (UID: \"f4698c34-c93e-4d6f-8ab8-2bfcf3118410\") " pod="service-telemetry/alertmanager-default-0" Mar 16 00:35:54 crc kubenswrapper[4816]: I0316 00:35:54.674491 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f4698c34-c93e-4d6f-8ab8-2bfcf3118410-tls-assets\") pod \"alertmanager-default-0\" (UID: \"f4698c34-c93e-4d6f-8ab8-2bfcf3118410\") " pod="service-telemetry/alertmanager-default-0" Mar 16 00:35:54 crc kubenswrapper[4816]: I0316 00:35:54.674513 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-default-alertmanager-proxy-tls\" (UniqueName: \"kubernetes.io/secret/f4698c34-c93e-4d6f-8ab8-2bfcf3118410-secret-default-alertmanager-proxy-tls\") pod \"alertmanager-default-0\" (UID: \"f4698c34-c93e-4d6f-8ab8-2bfcf3118410\") " pod="service-telemetry/alertmanager-default-0" Mar 16 00:35:54 crc kubenswrapper[4816]: I0316 00:35:54.674567 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-default-session-secret\" (UniqueName: \"kubernetes.io/secret/f4698c34-c93e-4d6f-8ab8-2bfcf3118410-secret-default-session-secret\") pod \"alertmanager-default-0\" (UID: \"f4698c34-c93e-4d6f-8ab8-2bfcf3118410\") " 
pod="service-telemetry/alertmanager-default-0" Mar 16 00:35:54 crc kubenswrapper[4816]: I0316 00:35:54.775502 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f4698c34-c93e-4d6f-8ab8-2bfcf3118410-tls-assets\") pod \"alertmanager-default-0\" (UID: \"f4698c34-c93e-4d6f-8ab8-2bfcf3118410\") " pod="service-telemetry/alertmanager-default-0" Mar 16 00:35:54 crc kubenswrapper[4816]: I0316 00:35:54.775560 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-alertmanager-proxy-tls\" (UniqueName: \"kubernetes.io/secret/f4698c34-c93e-4d6f-8ab8-2bfcf3118410-secret-default-alertmanager-proxy-tls\") pod \"alertmanager-default-0\" (UID: \"f4698c34-c93e-4d6f-8ab8-2bfcf3118410\") " pod="service-telemetry/alertmanager-default-0" Mar 16 00:35:54 crc kubenswrapper[4816]: I0316 00:35:54.775583 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-session-secret\" (UniqueName: \"kubernetes.io/secret/f4698c34-c93e-4d6f-8ab8-2bfcf3118410-secret-default-session-secret\") pod \"alertmanager-default-0\" (UID: \"f4698c34-c93e-4d6f-8ab8-2bfcf3118410\") " pod="service-telemetry/alertmanager-default-0" Mar 16 00:35:54 crc kubenswrapper[4816]: I0316 00:35:54.775648 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-82d58573-88f1-4f19-b2e3-271c3d59b363\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-82d58573-88f1-4f19-b2e3-271c3d59b363\") pod \"alertmanager-default-0\" (UID: \"f4698c34-c93e-4d6f-8ab8-2bfcf3118410\") " pod="service-telemetry/alertmanager-default-0" Mar 16 00:35:54 crc kubenswrapper[4816]: I0316 00:35:54.775663 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f4698c34-c93e-4d6f-8ab8-2bfcf3118410-config-out\") pod \"alertmanager-default-0\" (UID: 
\"f4698c34-c93e-4d6f-8ab8-2bfcf3118410\") " pod="service-telemetry/alertmanager-default-0"
Mar 16 00:35:54 crc kubenswrapper[4816]: I0316 00:35:54.775706 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f4698c34-c93e-4d6f-8ab8-2bfcf3118410-web-config\") pod \"alertmanager-default-0\" (UID: \"f4698c34-c93e-4d6f-8ab8-2bfcf3118410\") " pod="service-telemetry/alertmanager-default-0"
Mar 16 00:35:54 crc kubenswrapper[4816]: I0316 00:35:54.775733 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/f4698c34-c93e-4d6f-8ab8-2bfcf3118410-config-volume\") pod \"alertmanager-default-0\" (UID: \"f4698c34-c93e-4d6f-8ab8-2bfcf3118410\") " pod="service-telemetry/alertmanager-default-0"
Mar 16 00:35:54 crc kubenswrapper[4816]: I0316 00:35:54.775766 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/f4698c34-c93e-4d6f-8ab8-2bfcf3118410-cluster-tls-config\") pod \"alertmanager-default-0\" (UID: \"f4698c34-c93e-4d6f-8ab8-2bfcf3118410\") " pod="service-telemetry/alertmanager-default-0"
Mar 16 00:35:54 crc kubenswrapper[4816]: I0316 00:35:54.775787 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j75xg\" (UniqueName: \"kubernetes.io/projected/f4698c34-c93e-4d6f-8ab8-2bfcf3118410-kube-api-access-j75xg\") pod \"alertmanager-default-0\" (UID: \"f4698c34-c93e-4d6f-8ab8-2bfcf3118410\") " pod="service-telemetry/alertmanager-default-0"
Mar 16 00:35:54 crc kubenswrapper[4816]: E0316 00:35:54.776458 4816 secret.go:188] Couldn't get secret service-telemetry/default-alertmanager-proxy-tls: secret "default-alertmanager-proxy-tls" not found
Mar 16 00:35:54 crc kubenswrapper[4816]: E0316 00:35:54.776506 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f4698c34-c93e-4d6f-8ab8-2bfcf3118410-secret-default-alertmanager-proxy-tls podName:f4698c34-c93e-4d6f-8ab8-2bfcf3118410 nodeName:}" failed. No retries permitted until 2026-03-16 00:35:55.276491789 +0000 UTC m=+1748.372791742 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "secret-default-alertmanager-proxy-tls" (UniqueName: "kubernetes.io/secret/f4698c34-c93e-4d6f-8ab8-2bfcf3118410-secret-default-alertmanager-proxy-tls") pod "alertmanager-default-0" (UID: "f4698c34-c93e-4d6f-8ab8-2bfcf3118410") : secret "default-alertmanager-proxy-tls" not found
Mar 16 00:35:54 crc kubenswrapper[4816]: I0316 00:35:54.781618 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f4698c34-c93e-4d6f-8ab8-2bfcf3118410-web-config\") pod \"alertmanager-default-0\" (UID: \"f4698c34-c93e-4d6f-8ab8-2bfcf3118410\") " pod="service-telemetry/alertmanager-default-0"
Mar 16 00:35:54 crc kubenswrapper[4816]: I0316 00:35:54.782192 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f4698c34-c93e-4d6f-8ab8-2bfcf3118410-tls-assets\") pod \"alertmanager-default-0\" (UID: \"f4698c34-c93e-4d6f-8ab8-2bfcf3118410\") " pod="service-telemetry/alertmanager-default-0"
Mar 16 00:35:54 crc kubenswrapper[4816]: I0316 00:35:54.784100 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/f4698c34-c93e-4d6f-8ab8-2bfcf3118410-config-volume\") pod \"alertmanager-default-0\" (UID: \"f4698c34-c93e-4d6f-8ab8-2bfcf3118410\") " pod="service-telemetry/alertmanager-default-0"
Mar 16 00:35:54 crc kubenswrapper[4816]: I0316 00:35:54.784534 4816 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
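The `nestedpendingoperations.go:348` errors in this log show the kubelet backing off between mount retries: `durationBeforeRetry` is 500ms on the first alertmanager failure, then 1s, then 2s on the later ones. A minimal sketch of that doubling schedule, assuming an initial 500ms delay, a doubling factor, and an illustrative 2-minute cap (the exact cap is kubelet-internal and not visible in this log):

```python
from datetime import timedelta

def backoff_schedule(attempts, initial=timedelta(milliseconds=500),
                     factor=2, cap=timedelta(minutes=2)):
    """Return the durationBeforeRetry for each failed attempt:
    initial * factor**n, capped at `cap`."""
    delay = initial
    out = []
    for _ in range(attempts):
        out.append(delay)
        delay = min(delay * factor, cap)
    return out

# First three delays match the log: 500ms, 1s, 2s.
print([d.total_seconds() for d in backoff_schedule(3)])
```

The practical consequence visible here: once the missing secret is created, the next scheduled retry (at most one backoff interval later) succeeds and the pod proceeds to sandbox creation.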
Mar 16 00:35:54 crc kubenswrapper[4816]: I0316 00:35:54.784578 4816 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-82d58573-88f1-4f19-b2e3-271c3d59b363\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-82d58573-88f1-4f19-b2e3-271c3d59b363\") pod \"alertmanager-default-0\" (UID: \"f4698c34-c93e-4d6f-8ab8-2bfcf3118410\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/4ee44c0f5a301a68cae099f4f571c2cc1fd272d329d5eaa334a496c29057408e/globalmount\"" pod="service-telemetry/alertmanager-default-0" Mar 16 00:35:54 crc kubenswrapper[4816]: I0316 00:35:54.785763 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-default-session-secret\" (UniqueName: \"kubernetes.io/secret/f4698c34-c93e-4d6f-8ab8-2bfcf3118410-secret-default-session-secret\") pod \"alertmanager-default-0\" (UID: \"f4698c34-c93e-4d6f-8ab8-2bfcf3118410\") " pod="service-telemetry/alertmanager-default-0" Mar 16 00:35:54 crc kubenswrapper[4816]: I0316 00:35:54.786133 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/f4698c34-c93e-4d6f-8ab8-2bfcf3118410-cluster-tls-config\") pod \"alertmanager-default-0\" (UID: \"f4698c34-c93e-4d6f-8ab8-2bfcf3118410\") " pod="service-telemetry/alertmanager-default-0" Mar 16 00:35:54 crc kubenswrapper[4816]: I0316 00:35:54.792761 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f4698c34-c93e-4d6f-8ab8-2bfcf3118410-config-out\") pod \"alertmanager-default-0\" (UID: \"f4698c34-c93e-4d6f-8ab8-2bfcf3118410\") " pod="service-telemetry/alertmanager-default-0" Mar 16 00:35:54 crc kubenswrapper[4816]: I0316 00:35:54.795033 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j75xg\" (UniqueName: 
\"kubernetes.io/projected/f4698c34-c93e-4d6f-8ab8-2bfcf3118410-kube-api-access-j75xg\") pod \"alertmanager-default-0\" (UID: \"f4698c34-c93e-4d6f-8ab8-2bfcf3118410\") " pod="service-telemetry/alertmanager-default-0"
Mar 16 00:35:54 crc kubenswrapper[4816]: I0316 00:35:54.824703 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-82d58573-88f1-4f19-b2e3-271c3d59b363\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-82d58573-88f1-4f19-b2e3-271c3d59b363\") pod \"alertmanager-default-0\" (UID: \"f4698c34-c93e-4d6f-8ab8-2bfcf3118410\") " pod="service-telemetry/alertmanager-default-0"
Mar 16 00:35:55 crc kubenswrapper[4816]: I0316 00:35:55.283250 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-alertmanager-proxy-tls\" (UniqueName: \"kubernetes.io/secret/f4698c34-c93e-4d6f-8ab8-2bfcf3118410-secret-default-alertmanager-proxy-tls\") pod \"alertmanager-default-0\" (UID: \"f4698c34-c93e-4d6f-8ab8-2bfcf3118410\") " pod="service-telemetry/alertmanager-default-0"
Mar 16 00:35:55 crc kubenswrapper[4816]: E0316 00:35:55.283446 4816 secret.go:188] Couldn't get secret service-telemetry/default-alertmanager-proxy-tls: secret "default-alertmanager-proxy-tls" not found
Mar 16 00:35:55 crc kubenswrapper[4816]: E0316 00:35:55.283530 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f4698c34-c93e-4d6f-8ab8-2bfcf3118410-secret-default-alertmanager-proxy-tls podName:f4698c34-c93e-4d6f-8ab8-2bfcf3118410 nodeName:}" failed. No retries permitted until 2026-03-16 00:35:56.283512506 +0000 UTC m=+1749.379812449 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "secret-default-alertmanager-proxy-tls" (UniqueName: "kubernetes.io/secret/f4698c34-c93e-4d6f-8ab8-2bfcf3118410-secret-default-alertmanager-proxy-tls") pod "alertmanager-default-0" (UID: "f4698c34-c93e-4d6f-8ab8-2bfcf3118410") : secret "default-alertmanager-proxy-tls" not found
Mar 16 00:35:55 crc kubenswrapper[4816]: I0316 00:35:55.534355 4816 generic.go:334] "Generic (PLEG): container finished" podID="078376fd-a0f8-4157-8a07-23ce85695dc6" containerID="ad5d6ee1ddaedfce29f1a42c7783da7e5b45d684fe8d750e463ea53229b88e8e" exitCode=0
Mar 16 00:35:55 crc kubenswrapper[4816]: I0316 00:35:55.534396 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-default-0" event={"ID":"078376fd-a0f8-4157-8a07-23ce85695dc6","Type":"ContainerDied","Data":"ad5d6ee1ddaedfce29f1a42c7783da7e5b45d684fe8d750e463ea53229b88e8e"}
Mar 16 00:35:56 crc kubenswrapper[4816]: I0316 00:35:56.296759 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-alertmanager-proxy-tls\" (UniqueName: \"kubernetes.io/secret/f4698c34-c93e-4d6f-8ab8-2bfcf3118410-secret-default-alertmanager-proxy-tls\") pod \"alertmanager-default-0\" (UID: \"f4698c34-c93e-4d6f-8ab8-2bfcf3118410\") " pod="service-telemetry/alertmanager-default-0"
Mar 16 00:35:56 crc kubenswrapper[4816]: E0316 00:35:56.296927 4816 secret.go:188] Couldn't get secret service-telemetry/default-alertmanager-proxy-tls: secret "default-alertmanager-proxy-tls" not found
Mar 16 00:35:56 crc kubenswrapper[4816]: E0316 00:35:56.297113 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f4698c34-c93e-4d6f-8ab8-2bfcf3118410-secret-default-alertmanager-proxy-tls podName:f4698c34-c93e-4d6f-8ab8-2bfcf3118410 nodeName:}" failed. No retries permitted until 2026-03-16 00:35:58.297091637 +0000 UTC m=+1751.393391590 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "secret-default-alertmanager-proxy-tls" (UniqueName: "kubernetes.io/secret/f4698c34-c93e-4d6f-8ab8-2bfcf3118410-secret-default-alertmanager-proxy-tls") pod "alertmanager-default-0" (UID: "f4698c34-c93e-4d6f-8ab8-2bfcf3118410") : secret "default-alertmanager-proxy-tls" not found
Mar 16 00:35:58 crc kubenswrapper[4816]: I0316 00:35:58.325627 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-alertmanager-proxy-tls\" (UniqueName: \"kubernetes.io/secret/f4698c34-c93e-4d6f-8ab8-2bfcf3118410-secret-default-alertmanager-proxy-tls\") pod \"alertmanager-default-0\" (UID: \"f4698c34-c93e-4d6f-8ab8-2bfcf3118410\") " pod="service-telemetry/alertmanager-default-0"
Mar 16 00:35:58 crc kubenswrapper[4816]: I0316 00:35:58.335441 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-default-alertmanager-proxy-tls\" (UniqueName: \"kubernetes.io/secret/f4698c34-c93e-4d6f-8ab8-2bfcf3118410-secret-default-alertmanager-proxy-tls\") pod \"alertmanager-default-0\" (UID: \"f4698c34-c93e-4d6f-8ab8-2bfcf3118410\") " pod="service-telemetry/alertmanager-default-0"
Mar 16 00:35:58 crc kubenswrapper[4816]: I0316 00:35:58.477872 4816 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="service-telemetry/alertmanager-default-0" Mar 16 00:35:59 crc kubenswrapper[4816]: I0316 00:35:59.271731 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/alertmanager-default-0"] Mar 16 00:35:59 crc kubenswrapper[4816]: W0316 00:35:59.275072 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4698c34_c93e_4d6f_8ab8_2bfcf3118410.slice/crio-79cc4306d089725a98fc0e8dd5dbf2e63a5f38f1a2fb06a495d7a2bcb49ed0d7 WatchSource:0}: Error finding container 79cc4306d089725a98fc0e8dd5dbf2e63a5f38f1a2fb06a495d7a2bcb49ed0d7: Status 404 returned error can't find the container with id 79cc4306d089725a98fc0e8dd5dbf2e63a5f38f1a2fb06a495d7a2bcb49ed0d7 Mar 16 00:35:59 crc kubenswrapper[4816]: I0316 00:35:59.576353 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-snmp-webhook-6856cfb745-2n6hc" event={"ID":"5be5ca83-5116-48dc-8d6c-733cbd3e9682","Type":"ContainerStarted","Data":"1d2f4d2e87dd59ce19fa126c3b86c4e809d1d1b48ecda31a05892b90fe315aa5"} Mar 16 00:35:59 crc kubenswrapper[4816]: I0316 00:35:59.577254 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/alertmanager-default-0" event={"ID":"f4698c34-c93e-4d6f-8ab8-2bfcf3118410","Type":"ContainerStarted","Data":"79cc4306d089725a98fc0e8dd5dbf2e63a5f38f1a2fb06a495d7a2bcb49ed0d7"} Mar 16 00:35:59 crc kubenswrapper[4816]: I0316 00:35:59.594128 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-snmp-webhook-6856cfb745-2n6hc" podStartSLOduration=1.244467202 podStartE2EDuration="8.59409859s" podCreationTimestamp="2026-03-16 00:35:51 +0000 UTC" firstStartedPulling="2026-03-16 00:35:51.86165709 +0000 UTC m=+1744.957957053" lastFinishedPulling="2026-03-16 00:35:59.211288488 +0000 UTC m=+1752.307588441" observedRunningTime="2026-03-16 00:35:59.587874484 +0000 UTC m=+1752.684174437" 
watchObservedRunningTime="2026-03-16 00:35:59.59409859 +0000 UTC m=+1752.690398543" Mar 16 00:36:00 crc kubenswrapper[4816]: I0316 00:36:00.128490 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29560356-vl86r"] Mar 16 00:36:00 crc kubenswrapper[4816]: I0316 00:36:00.129475 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560356-vl86r" Mar 16 00:36:00 crc kubenswrapper[4816]: I0316 00:36:00.132454 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 16 00:36:00 crc kubenswrapper[4816]: I0316 00:36:00.132525 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-8hc2r" Mar 16 00:36:00 crc kubenswrapper[4816]: I0316 00:36:00.132454 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 16 00:36:00 crc kubenswrapper[4816]: I0316 00:36:00.141038 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29560356-vl86r"] Mar 16 00:36:00 crc kubenswrapper[4816]: I0316 00:36:00.191418 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-492kg\" (UniqueName: \"kubernetes.io/projected/0ffa2c59-99b2-4d5c-9c19-f0921ce688cb-kube-api-access-492kg\") pod \"auto-csr-approver-29560356-vl86r\" (UID: \"0ffa2c59-99b2-4d5c-9c19-f0921ce688cb\") " pod="openshift-infra/auto-csr-approver-29560356-vl86r" Mar 16 00:36:00 crc kubenswrapper[4816]: I0316 00:36:00.293022 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-492kg\" (UniqueName: \"kubernetes.io/projected/0ffa2c59-99b2-4d5c-9c19-f0921ce688cb-kube-api-access-492kg\") pod \"auto-csr-approver-29560356-vl86r\" (UID: \"0ffa2c59-99b2-4d5c-9c19-f0921ce688cb\") " 
pod="openshift-infra/auto-csr-approver-29560356-vl86r" Mar 16 00:36:00 crc kubenswrapper[4816]: I0316 00:36:00.315195 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-492kg\" (UniqueName: \"kubernetes.io/projected/0ffa2c59-99b2-4d5c-9c19-f0921ce688cb-kube-api-access-492kg\") pod \"auto-csr-approver-29560356-vl86r\" (UID: \"0ffa2c59-99b2-4d5c-9c19-f0921ce688cb\") " pod="openshift-infra/auto-csr-approver-29560356-vl86r" Mar 16 00:36:00 crc kubenswrapper[4816]: I0316 00:36:00.508931 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560356-vl86r" Mar 16 00:36:01 crc kubenswrapper[4816]: I0316 00:36:01.592235 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/alertmanager-default-0" event={"ID":"f4698c34-c93e-4d6f-8ab8-2bfcf3118410","Type":"ContainerStarted","Data":"ee1e85a329cf517225a11b56c35c0a2882b2081d4fc2c84d9ff8eeaf4e90e1da"} Mar 16 00:36:01 crc kubenswrapper[4816]: I0316 00:36:01.863089 4816 patch_prober.go:28] interesting pod/machine-config-daemon-jrdcz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 16 00:36:01 crc kubenswrapper[4816]: I0316 00:36:01.863150 4816 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" podUID="dd08ece2-7636-4966-973a-e96a34b70b53" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 16 00:36:03 crc kubenswrapper[4816]: I0316 00:36:03.175062 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29560356-vl86r"] Mar 16 00:36:03 crc kubenswrapper[4816]: I0316 00:36:03.608175 4816 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="service-telemetry/prometheus-default-0" event={"ID":"078376fd-a0f8-4157-8a07-23ce85695dc6","Type":"ContainerStarted","Data":"732049fbfa9d0a365a241a1d58f29d95ffbd1ab9e8ed904c2672dbf9e157a3e0"} Mar 16 00:36:03 crc kubenswrapper[4816]: I0316 00:36:03.609763 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560356-vl86r" event={"ID":"0ffa2c59-99b2-4d5c-9c19-f0921ce688cb","Type":"ContainerStarted","Data":"7438600c9a8d11afa86633eb3a1d44cf1dfa3d95de38221bd7c8d0e7539f9b23"} Mar 16 00:36:04 crc kubenswrapper[4816]: I0316 00:36:04.619328 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560356-vl86r" event={"ID":"0ffa2c59-99b2-4d5c-9c19-f0921ce688cb","Type":"ContainerStarted","Data":"a31a7a7d2a1fafc20c9ab619317cd87262faab560369319826f6c184261023c4"} Mar 16 00:36:04 crc kubenswrapper[4816]: I0316 00:36:04.645733 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29560356-vl86r" podStartSLOduration=3.676952788 podStartE2EDuration="4.645709474s" podCreationTimestamp="2026-03-16 00:36:00 +0000 UTC" firstStartedPulling="2026-03-16 00:36:03.180700865 +0000 UTC m=+1756.277000818" lastFinishedPulling="2026-03-16 00:36:04.149457531 +0000 UTC m=+1757.245757504" observedRunningTime="2026-03-16 00:36:04.634623551 +0000 UTC m=+1757.730923514" watchObservedRunningTime="2026-03-16 00:36:04.645709474 +0000 UTC m=+1757.742009427" Mar 16 00:36:05 crc kubenswrapper[4816]: I0316 00:36:05.628184 4816 generic.go:334] "Generic (PLEG): container finished" podID="0ffa2c59-99b2-4d5c-9c19-f0921ce688cb" containerID="a31a7a7d2a1fafc20c9ab619317cd87262faab560369319826f6c184261023c4" exitCode=0 Mar 16 00:36:05 crc kubenswrapper[4816]: I0316 00:36:05.628289 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560356-vl86r" 
event={"ID":"0ffa2c59-99b2-4d5c-9c19-f0921ce688cb","Type":"ContainerDied","Data":"a31a7a7d2a1fafc20c9ab619317cd87262faab560369319826f6c184261023c4"} Mar 16 00:36:05 crc kubenswrapper[4816]: I0316 00:36:05.634177 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-default-0" event={"ID":"078376fd-a0f8-4157-8a07-23ce85695dc6","Type":"ContainerStarted","Data":"f829b387c5d2c80c4804c84b02a83f69b186ab0515fa16f884633c0155dce723"} Mar 16 00:36:06 crc kubenswrapper[4816]: I0316 00:36:06.905848 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560356-vl86r" Mar 16 00:36:07 crc kubenswrapper[4816]: I0316 00:36:07.088097 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-492kg\" (UniqueName: \"kubernetes.io/projected/0ffa2c59-99b2-4d5c-9c19-f0921ce688cb-kube-api-access-492kg\") pod \"0ffa2c59-99b2-4d5c-9c19-f0921ce688cb\" (UID: \"0ffa2c59-99b2-4d5c-9c19-f0921ce688cb\") " Mar 16 00:36:07 crc kubenswrapper[4816]: I0316 00:36:07.107748 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ffa2c59-99b2-4d5c-9c19-f0921ce688cb-kube-api-access-492kg" (OuterVolumeSpecName: "kube-api-access-492kg") pod "0ffa2c59-99b2-4d5c-9c19-f0921ce688cb" (UID: "0ffa2c59-99b2-4d5c-9c19-f0921ce688cb"). InnerVolumeSpecName "kube-api-access-492kg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:36:07 crc kubenswrapper[4816]: I0316 00:36:07.189699 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-492kg\" (UniqueName: \"kubernetes.io/projected/0ffa2c59-99b2-4d5c-9c19-f0921ce688cb-kube-api-access-492kg\") on node \"crc\" DevicePath \"\"" Mar 16 00:36:07 crc kubenswrapper[4816]: I0316 00:36:07.650400 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560356-vl86r" event={"ID":"0ffa2c59-99b2-4d5c-9c19-f0921ce688cb","Type":"ContainerDied","Data":"7438600c9a8d11afa86633eb3a1d44cf1dfa3d95de38221bd7c8d0e7539f9b23"} Mar 16 00:36:07 crc kubenswrapper[4816]: I0316 00:36:07.650491 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7438600c9a8d11afa86633eb3a1d44cf1dfa3d95de38221bd7c8d0e7539f9b23" Mar 16 00:36:07 crc kubenswrapper[4816]: I0316 00:36:07.652944 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560356-vl86r" Mar 16 00:36:07 crc kubenswrapper[4816]: I0316 00:36:07.652974 4816 generic.go:334] "Generic (PLEG): container finished" podID="f4698c34-c93e-4d6f-8ab8-2bfcf3118410" containerID="ee1e85a329cf517225a11b56c35c0a2882b2081d4fc2c84d9ff8eeaf4e90e1da" exitCode=0 Mar 16 00:36:07 crc kubenswrapper[4816]: I0316 00:36:07.652997 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/alertmanager-default-0" event={"ID":"f4698c34-c93e-4d6f-8ab8-2bfcf3118410","Type":"ContainerDied","Data":"ee1e85a329cf517225a11b56c35c0a2882b2081d4fc2c84d9ff8eeaf4e90e1da"} Mar 16 00:36:07 crc kubenswrapper[4816]: I0316 00:36:07.701804 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29560350-6dpp5"] Mar 16 00:36:07 crc kubenswrapper[4816]: I0316 00:36:07.706708 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29560350-6dpp5"] Mar 
16 00:36:07 crc kubenswrapper[4816]: I0316 00:36:07.791978 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-m8n7t"] Mar 16 00:36:07 crc kubenswrapper[4816]: E0316 00:36:07.792307 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ffa2c59-99b2-4d5c-9c19-f0921ce688cb" containerName="oc" Mar 16 00:36:07 crc kubenswrapper[4816]: I0316 00:36:07.792321 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ffa2c59-99b2-4d5c-9c19-f0921ce688cb" containerName="oc" Mar 16 00:36:07 crc kubenswrapper[4816]: I0316 00:36:07.792483 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ffa2c59-99b2-4d5c-9c19-f0921ce688cb" containerName="oc" Mar 16 00:36:07 crc kubenswrapper[4816]: I0316 00:36:07.793368 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-m8n7t" Mar 16 00:36:07 crc kubenswrapper[4816]: I0316 00:36:07.795480 4816 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"smart-gateway-dockercfg-kjf9t" Mar 16 00:36:07 crc kubenswrapper[4816]: I0316 00:36:07.795802 4816 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"smart-gateway-session-secret" Mar 16 00:36:07 crc kubenswrapper[4816]: I0316 00:36:07.795906 4816 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-cloud1-coll-meter-proxy-tls" Mar 16 00:36:07 crc kubenswrapper[4816]: I0316 00:36:07.797252 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"default-cloud1-coll-meter-sg-core-configmap" Mar 16 00:36:07 crc kubenswrapper[4816]: I0316 00:36:07.804051 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-m8n7t"] Mar 16 00:36:07 crc kubenswrapper[4816]: I0316 00:36:07.900562 4816 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/4de6f751-2471-4ce9-a771-00703e7be02a-session-secret\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-m8n7t\" (UID: \"4de6f751-2471-4ce9-a771-00703e7be02a\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-m8n7t" Mar 16 00:36:07 crc kubenswrapper[4816]: I0316 00:36:07.900616 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/4de6f751-2471-4ce9-a771-00703e7be02a-socket-dir\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-m8n7t\" (UID: \"4de6f751-2471-4ce9-a771-00703e7be02a\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-m8n7t" Mar 16 00:36:07 crc kubenswrapper[4816]: I0316 00:36:07.900690 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/4de6f751-2471-4ce9-a771-00703e7be02a-sg-core-config\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-m8n7t\" (UID: \"4de6f751-2471-4ce9-a771-00703e7be02a\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-m8n7t" Mar 16 00:36:07 crc kubenswrapper[4816]: I0316 00:36:07.900710 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w65dq\" (UniqueName: \"kubernetes.io/projected/4de6f751-2471-4ce9-a771-00703e7be02a-kube-api-access-w65dq\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-m8n7t\" (UID: \"4de6f751-2471-4ce9-a771-00703e7be02a\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-m8n7t" Mar 16 00:36:07 crc kubenswrapper[4816]: I0316 00:36:07.900775 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"default-cloud1-coll-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/4de6f751-2471-4ce9-a771-00703e7be02a-default-cloud1-coll-meter-proxy-tls\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-m8n7t\" (UID: \"4de6f751-2471-4ce9-a771-00703e7be02a\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-m8n7t" Mar 16 00:36:08 crc kubenswrapper[4816]: I0316 00:36:08.001520 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/4de6f751-2471-4ce9-a771-00703e7be02a-sg-core-config\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-m8n7t\" (UID: \"4de6f751-2471-4ce9-a771-00703e7be02a\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-m8n7t" Mar 16 00:36:08 crc kubenswrapper[4816]: I0316 00:36:08.001579 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w65dq\" (UniqueName: \"kubernetes.io/projected/4de6f751-2471-4ce9-a771-00703e7be02a-kube-api-access-w65dq\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-m8n7t\" (UID: \"4de6f751-2471-4ce9-a771-00703e7be02a\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-m8n7t" Mar 16 00:36:08 crc kubenswrapper[4816]: I0316 00:36:08.001627 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-coll-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/4de6f751-2471-4ce9-a771-00703e7be02a-default-cloud1-coll-meter-proxy-tls\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-m8n7t\" (UID: \"4de6f751-2471-4ce9-a771-00703e7be02a\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-m8n7t" Mar 16 00:36:08 crc kubenswrapper[4816]: I0316 00:36:08.001661 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"session-secret\" (UniqueName: 
\"kubernetes.io/secret/4de6f751-2471-4ce9-a771-00703e7be02a-session-secret\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-m8n7t\" (UID: \"4de6f751-2471-4ce9-a771-00703e7be02a\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-m8n7t" Mar 16 00:36:08 crc kubenswrapper[4816]: I0316 00:36:08.001684 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/4de6f751-2471-4ce9-a771-00703e7be02a-socket-dir\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-m8n7t\" (UID: \"4de6f751-2471-4ce9-a771-00703e7be02a\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-m8n7t" Mar 16 00:36:08 crc kubenswrapper[4816]: E0316 00:36:08.002055 4816 secret.go:188] Couldn't get secret service-telemetry/default-cloud1-coll-meter-proxy-tls: secret "default-cloud1-coll-meter-proxy-tls" not found Mar 16 00:36:08 crc kubenswrapper[4816]: E0316 00:36:08.002108 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4de6f751-2471-4ce9-a771-00703e7be02a-default-cloud1-coll-meter-proxy-tls podName:4de6f751-2471-4ce9-a771-00703e7be02a nodeName:}" failed. No retries permitted until 2026-03-16 00:36:08.502093062 +0000 UTC m=+1761.598393015 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "default-cloud1-coll-meter-proxy-tls" (UniqueName: "kubernetes.io/secret/4de6f751-2471-4ce9-a771-00703e7be02a-default-cloud1-coll-meter-proxy-tls") pod "default-cloud1-coll-meter-smartgateway-7cd87f9766-m8n7t" (UID: "4de6f751-2471-4ce9-a771-00703e7be02a") : secret "default-cloud1-coll-meter-proxy-tls" not found Mar 16 00:36:08 crc kubenswrapper[4816]: I0316 00:36:08.002180 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/4de6f751-2471-4ce9-a771-00703e7be02a-socket-dir\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-m8n7t\" (UID: \"4de6f751-2471-4ce9-a771-00703e7be02a\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-m8n7t" Mar 16 00:36:08 crc kubenswrapper[4816]: I0316 00:36:08.002838 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/4de6f751-2471-4ce9-a771-00703e7be02a-sg-core-config\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-m8n7t\" (UID: \"4de6f751-2471-4ce9-a771-00703e7be02a\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-m8n7t" Mar 16 00:36:08 crc kubenswrapper[4816]: I0316 00:36:08.006593 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/4de6f751-2471-4ce9-a771-00703e7be02a-session-secret\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-m8n7t\" (UID: \"4de6f751-2471-4ce9-a771-00703e7be02a\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-m8n7t" Mar 16 00:36:08 crc kubenswrapper[4816]: I0316 00:36:08.017110 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w65dq\" (UniqueName: \"kubernetes.io/projected/4de6f751-2471-4ce9-a771-00703e7be02a-kube-api-access-w65dq\") pod 
\"default-cloud1-coll-meter-smartgateway-7cd87f9766-m8n7t\" (UID: \"4de6f751-2471-4ce9-a771-00703e7be02a\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-m8n7t" Mar 16 00:36:08 crc kubenswrapper[4816]: I0316 00:36:08.508471 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-coll-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/4de6f751-2471-4ce9-a771-00703e7be02a-default-cloud1-coll-meter-proxy-tls\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-m8n7t\" (UID: \"4de6f751-2471-4ce9-a771-00703e7be02a\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-m8n7t" Mar 16 00:36:08 crc kubenswrapper[4816]: E0316 00:36:08.508689 4816 secret.go:188] Couldn't get secret service-telemetry/default-cloud1-coll-meter-proxy-tls: secret "default-cloud1-coll-meter-proxy-tls" not found Mar 16 00:36:08 crc kubenswrapper[4816]: E0316 00:36:08.508955 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4de6f751-2471-4ce9-a771-00703e7be02a-default-cloud1-coll-meter-proxy-tls podName:4de6f751-2471-4ce9-a771-00703e7be02a nodeName:}" failed. No retries permitted until 2026-03-16 00:36:09.508935854 +0000 UTC m=+1762.605235807 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "default-cloud1-coll-meter-proxy-tls" (UniqueName: "kubernetes.io/secret/4de6f751-2471-4ce9-a771-00703e7be02a-default-cloud1-coll-meter-proxy-tls") pod "default-cloud1-coll-meter-smartgateway-7cd87f9766-m8n7t" (UID: "4de6f751-2471-4ce9-a771-00703e7be02a") : secret "default-cloud1-coll-meter-proxy-tls" not found Mar 16 00:36:09 crc kubenswrapper[4816]: I0316 00:36:09.521387 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-coll-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/4de6f751-2471-4ce9-a771-00703e7be02a-default-cloud1-coll-meter-proxy-tls\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-m8n7t\" (UID: \"4de6f751-2471-4ce9-a771-00703e7be02a\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-m8n7t" Mar 16 00:36:09 crc kubenswrapper[4816]: I0316 00:36:09.528310 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-cloud1-coll-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/4de6f751-2471-4ce9-a771-00703e7be02a-default-cloud1-coll-meter-proxy-tls\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-m8n7t\" (UID: \"4de6f751-2471-4ce9-a771-00703e7be02a\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-m8n7t" Mar 16 00:36:09 crc kubenswrapper[4816]: I0316 00:36:09.613535 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-m8n7t" Mar 16 00:36:09 crc kubenswrapper[4816]: I0316 00:36:09.678722 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12bfc435-89c2-4917-9bb6-cc2e9eca440c" path="/var/lib/kubelet/pods/12bfc435-89c2-4917-9bb6-cc2e9eca440c/volumes" Mar 16 00:36:10 crc kubenswrapper[4816]: I0316 00:36:10.070958 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-gjt8v"] Mar 16 00:36:10 crc kubenswrapper[4816]: I0316 00:36:10.079498 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-gjt8v" Mar 16 00:36:10 crc kubenswrapper[4816]: I0316 00:36:10.088037 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"default-cloud1-ceil-meter-sg-core-configmap" Mar 16 00:36:10 crc kubenswrapper[4816]: I0316 00:36:10.088657 4816 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-cloud1-ceil-meter-proxy-tls" Mar 16 00:36:10 crc kubenswrapper[4816]: I0316 00:36:10.111806 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-gjt8v"] Mar 16 00:36:10 crc kubenswrapper[4816]: I0316 00:36:10.234091 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/a7c7b38e-dd7e-469c-ab38-173944ca2943-socket-dir\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-gjt8v\" (UID: \"a7c7b38e-dd7e-469c-ab38-173944ca2943\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-gjt8v" Mar 16 00:36:10 crc kubenswrapper[4816]: I0316 00:36:10.234350 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-xk9q7\" (UniqueName: \"kubernetes.io/projected/a7c7b38e-dd7e-469c-ab38-173944ca2943-kube-api-access-xk9q7\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-gjt8v\" (UID: \"a7c7b38e-dd7e-469c-ab38-173944ca2943\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-gjt8v" Mar 16 00:36:10 crc kubenswrapper[4816]: I0316 00:36:10.234380 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/a7c7b38e-dd7e-469c-ab38-173944ca2943-sg-core-config\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-gjt8v\" (UID: \"a7c7b38e-dd7e-469c-ab38-173944ca2943\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-gjt8v" Mar 16 00:36:10 crc kubenswrapper[4816]: I0316 00:36:10.234445 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-cloud1-ceil-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/a7c7b38e-dd7e-469c-ab38-173944ca2943-default-cloud1-ceil-meter-proxy-tls\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-gjt8v\" (UID: \"a7c7b38e-dd7e-469c-ab38-173944ca2943\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-gjt8v" Mar 16 00:36:10 crc kubenswrapper[4816]: I0316 00:36:10.234482 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/a7c7b38e-dd7e-469c-ab38-173944ca2943-session-secret\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-gjt8v\" (UID: \"a7c7b38e-dd7e-469c-ab38-173944ca2943\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-gjt8v" Mar 16 00:36:10 crc kubenswrapper[4816]: I0316 00:36:10.335743 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: 
\"kubernetes.io/empty-dir/a7c7b38e-dd7e-469c-ab38-173944ca2943-socket-dir\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-gjt8v\" (UID: \"a7c7b38e-dd7e-469c-ab38-173944ca2943\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-gjt8v" Mar 16 00:36:10 crc kubenswrapper[4816]: I0316 00:36:10.335785 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xk9q7\" (UniqueName: \"kubernetes.io/projected/a7c7b38e-dd7e-469c-ab38-173944ca2943-kube-api-access-xk9q7\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-gjt8v\" (UID: \"a7c7b38e-dd7e-469c-ab38-173944ca2943\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-gjt8v" Mar 16 00:36:10 crc kubenswrapper[4816]: I0316 00:36:10.335817 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/a7c7b38e-dd7e-469c-ab38-173944ca2943-sg-core-config\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-gjt8v\" (UID: \"a7c7b38e-dd7e-469c-ab38-173944ca2943\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-gjt8v" Mar 16 00:36:10 crc kubenswrapper[4816]: I0316 00:36:10.335884 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-ceil-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/a7c7b38e-dd7e-469c-ab38-173944ca2943-default-cloud1-ceil-meter-proxy-tls\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-gjt8v\" (UID: \"a7c7b38e-dd7e-469c-ab38-173944ca2943\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-gjt8v" Mar 16 00:36:10 crc kubenswrapper[4816]: I0316 00:36:10.335925 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/a7c7b38e-dd7e-469c-ab38-173944ca2943-session-secret\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-gjt8v\" 
(UID: \"a7c7b38e-dd7e-469c-ab38-173944ca2943\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-gjt8v" Mar 16 00:36:10 crc kubenswrapper[4816]: E0316 00:36:10.336226 4816 secret.go:188] Couldn't get secret service-telemetry/default-cloud1-ceil-meter-proxy-tls: secret "default-cloud1-ceil-meter-proxy-tls" not found Mar 16 00:36:10 crc kubenswrapper[4816]: E0316 00:36:10.336303 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a7c7b38e-dd7e-469c-ab38-173944ca2943-default-cloud1-ceil-meter-proxy-tls podName:a7c7b38e-dd7e-469c-ab38-173944ca2943 nodeName:}" failed. No retries permitted until 2026-03-16 00:36:10.836284717 +0000 UTC m=+1763.932584660 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "default-cloud1-ceil-meter-proxy-tls" (UniqueName: "kubernetes.io/secret/a7c7b38e-dd7e-469c-ab38-173944ca2943-default-cloud1-ceil-meter-proxy-tls") pod "default-cloud1-ceil-meter-smartgateway-57948895dc-gjt8v" (UID: "a7c7b38e-dd7e-469c-ab38-173944ca2943") : secret "default-cloud1-ceil-meter-proxy-tls" not found Mar 16 00:36:10 crc kubenswrapper[4816]: I0316 00:36:10.336886 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/a7c7b38e-dd7e-469c-ab38-173944ca2943-socket-dir\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-gjt8v\" (UID: \"a7c7b38e-dd7e-469c-ab38-173944ca2943\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-gjt8v" Mar 16 00:36:10 crc kubenswrapper[4816]: I0316 00:36:10.337223 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/a7c7b38e-dd7e-469c-ab38-173944ca2943-sg-core-config\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-gjt8v\" (UID: \"a7c7b38e-dd7e-469c-ab38-173944ca2943\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-gjt8v" Mar 16 00:36:10 crc 
kubenswrapper[4816]: I0316 00:36:10.348107 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/a7c7b38e-dd7e-469c-ab38-173944ca2943-session-secret\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-gjt8v\" (UID: \"a7c7b38e-dd7e-469c-ab38-173944ca2943\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-gjt8v" Mar 16 00:36:10 crc kubenswrapper[4816]: I0316 00:36:10.353322 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xk9q7\" (UniqueName: \"kubernetes.io/projected/a7c7b38e-dd7e-469c-ab38-173944ca2943-kube-api-access-xk9q7\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-gjt8v\" (UID: \"a7c7b38e-dd7e-469c-ab38-173944ca2943\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-gjt8v" Mar 16 00:36:10 crc kubenswrapper[4816]: I0316 00:36:10.851360 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-ceil-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/a7c7b38e-dd7e-469c-ab38-173944ca2943-default-cloud1-ceil-meter-proxy-tls\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-gjt8v\" (UID: \"a7c7b38e-dd7e-469c-ab38-173944ca2943\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-gjt8v" Mar 16 00:36:10 crc kubenswrapper[4816]: E0316 00:36:10.851575 4816 secret.go:188] Couldn't get secret service-telemetry/default-cloud1-ceil-meter-proxy-tls: secret "default-cloud1-ceil-meter-proxy-tls" not found Mar 16 00:36:10 crc kubenswrapper[4816]: E0316 00:36:10.851785 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a7c7b38e-dd7e-469c-ab38-173944ca2943-default-cloud1-ceil-meter-proxy-tls podName:a7c7b38e-dd7e-469c-ab38-173944ca2943 nodeName:}" failed. No retries permitted until 2026-03-16 00:36:11.851767802 +0000 UTC m=+1764.948067755 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "default-cloud1-ceil-meter-proxy-tls" (UniqueName: "kubernetes.io/secret/a7c7b38e-dd7e-469c-ab38-173944ca2943-default-cloud1-ceil-meter-proxy-tls") pod "default-cloud1-ceil-meter-smartgateway-57948895dc-gjt8v" (UID: "a7c7b38e-dd7e-469c-ab38-173944ca2943") : secret "default-cloud1-ceil-meter-proxy-tls" not found Mar 16 00:36:11 crc kubenswrapper[4816]: I0316 00:36:11.864333 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-ceil-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/a7c7b38e-dd7e-469c-ab38-173944ca2943-default-cloud1-ceil-meter-proxy-tls\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-gjt8v\" (UID: \"a7c7b38e-dd7e-469c-ab38-173944ca2943\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-gjt8v" Mar 16 00:36:11 crc kubenswrapper[4816]: I0316 00:36:11.871815 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-cloud1-ceil-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/a7c7b38e-dd7e-469c-ab38-173944ca2943-default-cloud1-ceil-meter-proxy-tls\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-gjt8v\" (UID: \"a7c7b38e-dd7e-469c-ab38-173944ca2943\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-gjt8v" Mar 16 00:36:11 crc kubenswrapper[4816]: I0316 00:36:11.912473 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-gjt8v" Mar 16 00:36:12 crc kubenswrapper[4816]: I0316 00:36:12.414326 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-m8n7t"] Mar 16 00:36:12 crc kubenswrapper[4816]: I0316 00:36:12.474569 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-gjt8v"] Mar 16 00:36:13 crc kubenswrapper[4816]: W0316 00:36:13.003883 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4de6f751_2471_4ce9_a771_00703e7be02a.slice/crio-5e33734c707e8f6159a1f1a62d31885cbef60b497c91cbefd8fbc54cc6cef8f3 WatchSource:0}: Error finding container 5e33734c707e8f6159a1f1a62d31885cbef60b497c91cbefd8fbc54cc6cef8f3: Status 404 returned error can't find the container with id 5e33734c707e8f6159a1f1a62d31885cbef60b497c91cbefd8fbc54cc6cef8f3 Mar 16 00:36:13 crc kubenswrapper[4816]: I0316 00:36:13.606293 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-lr826"] Mar 16 00:36:13 crc kubenswrapper[4816]: I0316 00:36:13.607744 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-lr826" Mar 16 00:36:13 crc kubenswrapper[4816]: I0316 00:36:13.613186 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"default-cloud1-sens-meter-sg-core-configmap" Mar 16 00:36:13 crc kubenswrapper[4816]: I0316 00:36:13.617111 4816 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-cloud1-sens-meter-proxy-tls" Mar 16 00:36:13 crc kubenswrapper[4816]: I0316 00:36:13.623176 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-lr826"] Mar 16 00:36:13 crc kubenswrapper[4816]: I0316 00:36:13.691459 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-m8n7t" event={"ID":"4de6f751-2471-4ce9-a771-00703e7be02a","Type":"ContainerStarted","Data":"5e33734c707e8f6159a1f1a62d31885cbef60b497c91cbefd8fbc54cc6cef8f3"} Mar 16 00:36:13 crc kubenswrapper[4816]: I0316 00:36:13.693396 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/alertmanager-default-0" event={"ID":"f4698c34-c93e-4d6f-8ab8-2bfcf3118410","Type":"ContainerStarted","Data":"59a7711ad7908a9dcaa1092ab168fd24e8dc87ebad32c5c5d835eb703fbf9df0"} Mar 16 00:36:13 crc kubenswrapper[4816]: I0316 00:36:13.694984 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-gjt8v" event={"ID":"a7c7b38e-dd7e-469c-ab38-173944ca2943","Type":"ContainerStarted","Data":"c6997e782546bb7636f289fcb9df20186bc0a72c1f6bdabbe999d35fdb7ee8b1"} Mar 16 00:36:13 crc kubenswrapper[4816]: I0316 00:36:13.697344 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-default-0" 
event={"ID":"078376fd-a0f8-4157-8a07-23ce85695dc6","Type":"ContainerStarted","Data":"063d12ce8565ba6c45553a0e36f210c5d4eca03103480ebab524025fbbba362d"} Mar 16 00:36:13 crc kubenswrapper[4816]: I0316 00:36:13.727513 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/prometheus-default-0" podStartSLOduration=3.855844548 podStartE2EDuration="33.727495658s" podCreationTimestamp="2026-03-16 00:35:40 +0000 UTC" firstStartedPulling="2026-03-16 00:35:43.191757439 +0000 UTC m=+1736.288057392" lastFinishedPulling="2026-03-16 00:36:13.063408549 +0000 UTC m=+1766.159708502" observedRunningTime="2026-03-16 00:36:13.724326108 +0000 UTC m=+1766.820626061" watchObservedRunningTime="2026-03-16 00:36:13.727495658 +0000 UTC m=+1766.823795611" Mar 16 00:36:13 crc kubenswrapper[4816]: I0316 00:36:13.790908 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/ee673348-980c-44f5-8e33-71a859ce740c-socket-dir\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-lr826\" (UID: \"ee673348-980c-44f5-8e33-71a859ce740c\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-lr826" Mar 16 00:36:13 crc kubenswrapper[4816]: I0316 00:36:13.790976 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frmkq\" (UniqueName: \"kubernetes.io/projected/ee673348-980c-44f5-8e33-71a859ce740c-kube-api-access-frmkq\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-lr826\" (UID: \"ee673348-980c-44f5-8e33-71a859ce740c\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-lr826" Mar 16 00:36:13 crc kubenswrapper[4816]: I0316 00:36:13.791046 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-cloud1-sens-meter-proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/ee673348-980c-44f5-8e33-71a859ce740c-default-cloud1-sens-meter-proxy-tls\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-lr826\" (UID: \"ee673348-980c-44f5-8e33-71a859ce740c\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-lr826" Mar 16 00:36:13 crc kubenswrapper[4816]: I0316 00:36:13.791171 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/ee673348-980c-44f5-8e33-71a859ce740c-sg-core-config\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-lr826\" (UID: \"ee673348-980c-44f5-8e33-71a859ce740c\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-lr826" Mar 16 00:36:13 crc kubenswrapper[4816]: I0316 00:36:13.791240 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/ee673348-980c-44f5-8e33-71a859ce740c-session-secret\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-lr826\" (UID: \"ee673348-980c-44f5-8e33-71a859ce740c\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-lr826" Mar 16 00:36:13 crc kubenswrapper[4816]: I0316 00:36:13.892538 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/ee673348-980c-44f5-8e33-71a859ce740c-sg-core-config\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-lr826\" (UID: \"ee673348-980c-44f5-8e33-71a859ce740c\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-lr826" Mar 16 00:36:13 crc kubenswrapper[4816]: I0316 00:36:13.892624 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/ee673348-980c-44f5-8e33-71a859ce740c-session-secret\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-lr826\" (UID: 
\"ee673348-980c-44f5-8e33-71a859ce740c\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-lr826" Mar 16 00:36:13 crc kubenswrapper[4816]: I0316 00:36:13.892655 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/ee673348-980c-44f5-8e33-71a859ce740c-socket-dir\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-lr826\" (UID: \"ee673348-980c-44f5-8e33-71a859ce740c\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-lr826" Mar 16 00:36:13 crc kubenswrapper[4816]: I0316 00:36:13.892677 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-frmkq\" (UniqueName: \"kubernetes.io/projected/ee673348-980c-44f5-8e33-71a859ce740c-kube-api-access-frmkq\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-lr826\" (UID: \"ee673348-980c-44f5-8e33-71a859ce740c\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-lr826" Mar 16 00:36:13 crc kubenswrapper[4816]: I0316 00:36:13.892718 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-sens-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/ee673348-980c-44f5-8e33-71a859ce740c-default-cloud1-sens-meter-proxy-tls\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-lr826\" (UID: \"ee673348-980c-44f5-8e33-71a859ce740c\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-lr826" Mar 16 00:36:13 crc kubenswrapper[4816]: E0316 00:36:13.892835 4816 secret.go:188] Couldn't get secret service-telemetry/default-cloud1-sens-meter-proxy-tls: secret "default-cloud1-sens-meter-proxy-tls" not found Mar 16 00:36:13 crc kubenswrapper[4816]: E0316 00:36:13.892895 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ee673348-980c-44f5-8e33-71a859ce740c-default-cloud1-sens-meter-proxy-tls podName:ee673348-980c-44f5-8e33-71a859ce740c 
nodeName:}" failed. No retries permitted until 2026-03-16 00:36:14.392880345 +0000 UTC m=+1767.489180298 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "default-cloud1-sens-meter-proxy-tls" (UniqueName: "kubernetes.io/secret/ee673348-980c-44f5-8e33-71a859ce740c-default-cloud1-sens-meter-proxy-tls") pod "default-cloud1-sens-meter-smartgateway-5759b4d97-lr826" (UID: "ee673348-980c-44f5-8e33-71a859ce740c") : secret "default-cloud1-sens-meter-proxy-tls" not found Mar 16 00:36:13 crc kubenswrapper[4816]: I0316 00:36:13.893524 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/ee673348-980c-44f5-8e33-71a859ce740c-socket-dir\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-lr826\" (UID: \"ee673348-980c-44f5-8e33-71a859ce740c\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-lr826" Mar 16 00:36:13 crc kubenswrapper[4816]: I0316 00:36:13.896206 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/ee673348-980c-44f5-8e33-71a859ce740c-sg-core-config\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-lr826\" (UID: \"ee673348-980c-44f5-8e33-71a859ce740c\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-lr826" Mar 16 00:36:13 crc kubenswrapper[4816]: I0316 00:36:13.901141 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/ee673348-980c-44f5-8e33-71a859ce740c-session-secret\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-lr826\" (UID: \"ee673348-980c-44f5-8e33-71a859ce740c\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-lr826" Mar 16 00:36:13 crc kubenswrapper[4816]: I0316 00:36:13.913117 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-frmkq\" (UniqueName: 
\"kubernetes.io/projected/ee673348-980c-44f5-8e33-71a859ce740c-kube-api-access-frmkq\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-lr826\" (UID: \"ee673348-980c-44f5-8e33-71a859ce740c\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-lr826" Mar 16 00:36:14 crc kubenswrapper[4816]: I0316 00:36:14.400265 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-sens-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/ee673348-980c-44f5-8e33-71a859ce740c-default-cloud1-sens-meter-proxy-tls\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-lr826\" (UID: \"ee673348-980c-44f5-8e33-71a859ce740c\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-lr826" Mar 16 00:36:14 crc kubenswrapper[4816]: E0316 00:36:14.400486 4816 secret.go:188] Couldn't get secret service-telemetry/default-cloud1-sens-meter-proxy-tls: secret "default-cloud1-sens-meter-proxy-tls" not found Mar 16 00:36:14 crc kubenswrapper[4816]: E0316 00:36:14.400623 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ee673348-980c-44f5-8e33-71a859ce740c-default-cloud1-sens-meter-proxy-tls podName:ee673348-980c-44f5-8e33-71a859ce740c nodeName:}" failed. No retries permitted until 2026-03-16 00:36:15.400598101 +0000 UTC m=+1768.496898054 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "default-cloud1-sens-meter-proxy-tls" (UniqueName: "kubernetes.io/secret/ee673348-980c-44f5-8e33-71a859ce740c-default-cloud1-sens-meter-proxy-tls") pod "default-cloud1-sens-meter-smartgateway-5759b4d97-lr826" (UID: "ee673348-980c-44f5-8e33-71a859ce740c") : secret "default-cloud1-sens-meter-proxy-tls" not found Mar 16 00:36:14 crc kubenswrapper[4816]: I0316 00:36:14.709129 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-gjt8v" event={"ID":"a7c7b38e-dd7e-469c-ab38-173944ca2943","Type":"ContainerStarted","Data":"5e6adc7c1ceb4dee9d75da1b60f5314b5d489eb5abca26b0e1e9530c29a279af"} Mar 16 00:36:14 crc kubenswrapper[4816]: I0316 00:36:14.714636 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-m8n7t" event={"ID":"4de6f751-2471-4ce9-a771-00703e7be02a","Type":"ContainerStarted","Data":"48148e28d8fe141bbe5a156cd9d78d73b91d4be809f4c72cc046d7ef3f240d1f"} Mar 16 00:36:15 crc kubenswrapper[4816]: I0316 00:36:15.418171 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-sens-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/ee673348-980c-44f5-8e33-71a859ce740c-default-cloud1-sens-meter-proxy-tls\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-lr826\" (UID: \"ee673348-980c-44f5-8e33-71a859ce740c\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-lr826" Mar 16 00:36:15 crc kubenswrapper[4816]: I0316 00:36:15.424376 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-cloud1-sens-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/ee673348-980c-44f5-8e33-71a859ce740c-default-cloud1-sens-meter-proxy-tls\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-lr826\" (UID: \"ee673348-980c-44f5-8e33-71a859ce740c\") " 
pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-lr826" Mar 16 00:36:15 crc kubenswrapper[4816]: I0316 00:36:15.721338 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-lr826" Mar 16 00:36:15 crc kubenswrapper[4816]: I0316 00:36:15.725989 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-m8n7t" event={"ID":"4de6f751-2471-4ce9-a771-00703e7be02a","Type":"ContainerStarted","Data":"f9d1d4fbe1538ba3d5bfad270eb0c25d99d3990476fb2033e0544e3446cf984e"} Mar 16 00:36:15 crc kubenswrapper[4816]: I0316 00:36:15.728445 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/alertmanager-default-0" event={"ID":"f4698c34-c93e-4d6f-8ab8-2bfcf3118410","Type":"ContainerStarted","Data":"8588f16d5ec005f836154019cd40949f4d783d7ff7a353de6aa46aead0c27846"} Mar 16 00:36:15 crc kubenswrapper[4816]: I0316 00:36:15.730349 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-gjt8v" event={"ID":"a7c7b38e-dd7e-469c-ab38-173944ca2943","Type":"ContainerStarted","Data":"38d2258b8e0ac413732a26743c1c7812f0e79b91c6b5bd0ac251d5ae2c83223c"} Mar 16 00:36:16 crc kubenswrapper[4816]: I0316 00:36:16.225725 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-lr826"] Mar 16 00:36:16 crc kubenswrapper[4816]: W0316 00:36:16.238175 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podee673348_980c_44f5_8e33_71a859ce740c.slice/crio-3b07e4f45a2e03a71adc0a786082d1632e61b6814483c597e3c68c14c7ad5d1c WatchSource:0}: Error finding container 3b07e4f45a2e03a71adc0a786082d1632e61b6814483c597e3c68c14c7ad5d1c: Status 404 returned error can't find the container with id 
3b07e4f45a2e03a71adc0a786082d1632e61b6814483c597e3c68c14c7ad5d1c Mar 16 00:36:16 crc kubenswrapper[4816]: I0316 00:36:16.744173 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/alertmanager-default-0" event={"ID":"f4698c34-c93e-4d6f-8ab8-2bfcf3118410","Type":"ContainerStarted","Data":"4903f66673eb9543339774d9ac0280a8999d0232a84a23d8a55a030d79be9dbe"} Mar 16 00:36:16 crc kubenswrapper[4816]: I0316 00:36:16.745727 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-lr826" event={"ID":"ee673348-980c-44f5-8e33-71a859ce740c","Type":"ContainerStarted","Data":"3b07e4f45a2e03a71adc0a786082d1632e61b6814483c597e3c68c14c7ad5d1c"} Mar 16 00:36:16 crc kubenswrapper[4816]: I0316 00:36:16.771252 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/alertmanager-default-0" podStartSLOduration=15.579575346 podStartE2EDuration="23.771235254s" podCreationTimestamp="2026-03-16 00:35:53 +0000 UTC" firstStartedPulling="2026-03-16 00:36:07.655468051 +0000 UTC m=+1760.751768004" lastFinishedPulling="2026-03-16 00:36:15.847127959 +0000 UTC m=+1768.943427912" observedRunningTime="2026-03-16 00:36:16.765890654 +0000 UTC m=+1769.862190607" watchObservedRunningTime="2026-03-16 00:36:16.771235254 +0000 UTC m=+1769.867535207" Mar 16 00:36:17 crc kubenswrapper[4816]: I0316 00:36:17.755995 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-lr826" event={"ID":"ee673348-980c-44f5-8e33-71a859ce740c","Type":"ContainerStarted","Data":"cbce52285f31a47858a3995de5e0aeab1ce413b10f830c469df5d563447b72c3"} Mar 16 00:36:17 crc kubenswrapper[4816]: I0316 00:36:17.902164 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="service-telemetry/prometheus-default-0" Mar 16 00:36:20 crc kubenswrapper[4816]: I0316 00:36:20.881242 4816 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["service-telemetry/default-cloud1-coll-event-smartgateway-598dc6844-7wkmc"] Mar 16 00:36:20 crc kubenswrapper[4816]: I0316 00:36:20.882920 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-cloud1-coll-event-smartgateway-598dc6844-7wkmc" Mar 16 00:36:20 crc kubenswrapper[4816]: I0316 00:36:20.885584 4816 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-cert" Mar 16 00:36:20 crc kubenswrapper[4816]: I0316 00:36:20.885828 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"default-cloud1-coll-event-sg-core-configmap" Mar 16 00:36:20 crc kubenswrapper[4816]: I0316 00:36:20.891319 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-coll-event-smartgateway-598dc6844-7wkmc"] Mar 16 00:36:21 crc kubenswrapper[4816]: I0316 00:36:21.037520 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/71ca3af9-1a2f-4bd2-898a-13c9089b16c2-sg-core-config\") pod \"default-cloud1-coll-event-smartgateway-598dc6844-7wkmc\" (UID: \"71ca3af9-1a2f-4bd2-898a-13c9089b16c2\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-598dc6844-7wkmc" Mar 16 00:36:21 crc kubenswrapper[4816]: I0316 00:36:21.037599 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-certs\" (UniqueName: \"kubernetes.io/secret/71ca3af9-1a2f-4bd2-898a-13c9089b16c2-elastic-certs\") pod \"default-cloud1-coll-event-smartgateway-598dc6844-7wkmc\" (UID: \"71ca3af9-1a2f-4bd2-898a-13c9089b16c2\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-598dc6844-7wkmc" Mar 16 00:36:21 crc kubenswrapper[4816]: I0316 00:36:21.037624 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bcb4d\" 
(UniqueName: \"kubernetes.io/projected/71ca3af9-1a2f-4bd2-898a-13c9089b16c2-kube-api-access-bcb4d\") pod \"default-cloud1-coll-event-smartgateway-598dc6844-7wkmc\" (UID: \"71ca3af9-1a2f-4bd2-898a-13c9089b16c2\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-598dc6844-7wkmc" Mar 16 00:36:21 crc kubenswrapper[4816]: I0316 00:36:21.037663 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/71ca3af9-1a2f-4bd2-898a-13c9089b16c2-socket-dir\") pod \"default-cloud1-coll-event-smartgateway-598dc6844-7wkmc\" (UID: \"71ca3af9-1a2f-4bd2-898a-13c9089b16c2\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-598dc6844-7wkmc" Mar 16 00:36:21 crc kubenswrapper[4816]: I0316 00:36:21.139243 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/71ca3af9-1a2f-4bd2-898a-13c9089b16c2-socket-dir\") pod \"default-cloud1-coll-event-smartgateway-598dc6844-7wkmc\" (UID: \"71ca3af9-1a2f-4bd2-898a-13c9089b16c2\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-598dc6844-7wkmc" Mar 16 00:36:21 crc kubenswrapper[4816]: I0316 00:36:21.139925 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/71ca3af9-1a2f-4bd2-898a-13c9089b16c2-sg-core-config\") pod \"default-cloud1-coll-event-smartgateway-598dc6844-7wkmc\" (UID: \"71ca3af9-1a2f-4bd2-898a-13c9089b16c2\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-598dc6844-7wkmc" Mar 16 00:36:21 crc kubenswrapper[4816]: I0316 00:36:21.140086 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-certs\" (UniqueName: \"kubernetes.io/secret/71ca3af9-1a2f-4bd2-898a-13c9089b16c2-elastic-certs\") pod \"default-cloud1-coll-event-smartgateway-598dc6844-7wkmc\" (UID: 
\"71ca3af9-1a2f-4bd2-898a-13c9089b16c2\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-598dc6844-7wkmc" Mar 16 00:36:21 crc kubenswrapper[4816]: I0316 00:36:21.140211 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bcb4d\" (UniqueName: \"kubernetes.io/projected/71ca3af9-1a2f-4bd2-898a-13c9089b16c2-kube-api-access-bcb4d\") pod \"default-cloud1-coll-event-smartgateway-598dc6844-7wkmc\" (UID: \"71ca3af9-1a2f-4bd2-898a-13c9089b16c2\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-598dc6844-7wkmc" Mar 16 00:36:21 crc kubenswrapper[4816]: I0316 00:36:21.139705 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/71ca3af9-1a2f-4bd2-898a-13c9089b16c2-socket-dir\") pod \"default-cloud1-coll-event-smartgateway-598dc6844-7wkmc\" (UID: \"71ca3af9-1a2f-4bd2-898a-13c9089b16c2\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-598dc6844-7wkmc" Mar 16 00:36:21 crc kubenswrapper[4816]: I0316 00:36:21.140937 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/71ca3af9-1a2f-4bd2-898a-13c9089b16c2-sg-core-config\") pod \"default-cloud1-coll-event-smartgateway-598dc6844-7wkmc\" (UID: \"71ca3af9-1a2f-4bd2-898a-13c9089b16c2\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-598dc6844-7wkmc" Mar 16 00:36:21 crc kubenswrapper[4816]: I0316 00:36:21.155216 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-certs\" (UniqueName: \"kubernetes.io/secret/71ca3af9-1a2f-4bd2-898a-13c9089b16c2-elastic-certs\") pod \"default-cloud1-coll-event-smartgateway-598dc6844-7wkmc\" (UID: \"71ca3af9-1a2f-4bd2-898a-13c9089b16c2\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-598dc6844-7wkmc" Mar 16 00:36:21 crc kubenswrapper[4816]: I0316 00:36:21.164722 4816 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-bcb4d\" (UniqueName: \"kubernetes.io/projected/71ca3af9-1a2f-4bd2-898a-13c9089b16c2-kube-api-access-bcb4d\") pod \"default-cloud1-coll-event-smartgateway-598dc6844-7wkmc\" (UID: \"71ca3af9-1a2f-4bd2-898a-13c9089b16c2\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-598dc6844-7wkmc" Mar 16 00:36:21 crc kubenswrapper[4816]: I0316 00:36:21.261394 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-cloud1-coll-event-smartgateway-598dc6844-7wkmc" Mar 16 00:36:21 crc kubenswrapper[4816]: I0316 00:36:21.827904 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-cloud1-ceil-event-smartgateway-6cc4dc6457-55sts"] Mar 16 00:36:21 crc kubenswrapper[4816]: I0316 00:36:21.829455 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-6cc4dc6457-55sts" Mar 16 00:36:21 crc kubenswrapper[4816]: I0316 00:36:21.831725 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"default-cloud1-ceil-event-sg-core-configmap" Mar 16 00:36:21 crc kubenswrapper[4816]: I0316 00:36:21.838626 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-ceil-event-smartgateway-6cc4dc6457-55sts"] Mar 16 00:36:21 crc kubenswrapper[4816]: I0316 00:36:21.950933 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/134459cc-413b-4996-a0c1-aafe8dae8ebb-sg-core-config\") pod \"default-cloud1-ceil-event-smartgateway-6cc4dc6457-55sts\" (UID: \"134459cc-413b-4996-a0c1-aafe8dae8ebb\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-6cc4dc6457-55sts" Mar 16 00:36:21 crc kubenswrapper[4816]: I0316 00:36:21.950999 4816 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-certs\" (UniqueName: \"kubernetes.io/secret/134459cc-413b-4996-a0c1-aafe8dae8ebb-elastic-certs\") pod \"default-cloud1-ceil-event-smartgateway-6cc4dc6457-55sts\" (UID: \"134459cc-413b-4996-a0c1-aafe8dae8ebb\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-6cc4dc6457-55sts" Mar 16 00:36:21 crc kubenswrapper[4816]: I0316 00:36:21.951043 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/134459cc-413b-4996-a0c1-aafe8dae8ebb-socket-dir\") pod \"default-cloud1-ceil-event-smartgateway-6cc4dc6457-55sts\" (UID: \"134459cc-413b-4996-a0c1-aafe8dae8ebb\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-6cc4dc6457-55sts" Mar 16 00:36:21 crc kubenswrapper[4816]: I0316 00:36:21.951089 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nm5lr\" (UniqueName: \"kubernetes.io/projected/134459cc-413b-4996-a0c1-aafe8dae8ebb-kube-api-access-nm5lr\") pod \"default-cloud1-ceil-event-smartgateway-6cc4dc6457-55sts\" (UID: \"134459cc-413b-4996-a0c1-aafe8dae8ebb\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-6cc4dc6457-55sts" Mar 16 00:36:22 crc kubenswrapper[4816]: I0316 00:36:22.052278 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/134459cc-413b-4996-a0c1-aafe8dae8ebb-sg-core-config\") pod \"default-cloud1-ceil-event-smartgateway-6cc4dc6457-55sts\" (UID: \"134459cc-413b-4996-a0c1-aafe8dae8ebb\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-6cc4dc6457-55sts" Mar 16 00:36:22 crc kubenswrapper[4816]: I0316 00:36:22.052342 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-certs\" (UniqueName: 
\"kubernetes.io/secret/134459cc-413b-4996-a0c1-aafe8dae8ebb-elastic-certs\") pod \"default-cloud1-ceil-event-smartgateway-6cc4dc6457-55sts\" (UID: \"134459cc-413b-4996-a0c1-aafe8dae8ebb\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-6cc4dc6457-55sts" Mar 16 00:36:22 crc kubenswrapper[4816]: I0316 00:36:22.052387 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/134459cc-413b-4996-a0c1-aafe8dae8ebb-socket-dir\") pod \"default-cloud1-ceil-event-smartgateway-6cc4dc6457-55sts\" (UID: \"134459cc-413b-4996-a0c1-aafe8dae8ebb\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-6cc4dc6457-55sts" Mar 16 00:36:22 crc kubenswrapper[4816]: I0316 00:36:22.052434 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nm5lr\" (UniqueName: \"kubernetes.io/projected/134459cc-413b-4996-a0c1-aafe8dae8ebb-kube-api-access-nm5lr\") pod \"default-cloud1-ceil-event-smartgateway-6cc4dc6457-55sts\" (UID: \"134459cc-413b-4996-a0c1-aafe8dae8ebb\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-6cc4dc6457-55sts" Mar 16 00:36:22 crc kubenswrapper[4816]: I0316 00:36:22.053134 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/134459cc-413b-4996-a0c1-aafe8dae8ebb-socket-dir\") pod \"default-cloud1-ceil-event-smartgateway-6cc4dc6457-55sts\" (UID: \"134459cc-413b-4996-a0c1-aafe8dae8ebb\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-6cc4dc6457-55sts" Mar 16 00:36:22 crc kubenswrapper[4816]: I0316 00:36:22.053479 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/134459cc-413b-4996-a0c1-aafe8dae8ebb-sg-core-config\") pod \"default-cloud1-ceil-event-smartgateway-6cc4dc6457-55sts\" (UID: \"134459cc-413b-4996-a0c1-aafe8dae8ebb\") " 
pod="service-telemetry/default-cloud1-ceil-event-smartgateway-6cc4dc6457-55sts" Mar 16 00:36:22 crc kubenswrapper[4816]: I0316 00:36:22.055876 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-certs\" (UniqueName: \"kubernetes.io/secret/134459cc-413b-4996-a0c1-aafe8dae8ebb-elastic-certs\") pod \"default-cloud1-ceil-event-smartgateway-6cc4dc6457-55sts\" (UID: \"134459cc-413b-4996-a0c1-aafe8dae8ebb\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-6cc4dc6457-55sts" Mar 16 00:36:22 crc kubenswrapper[4816]: I0316 00:36:22.071320 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nm5lr\" (UniqueName: \"kubernetes.io/projected/134459cc-413b-4996-a0c1-aafe8dae8ebb-kube-api-access-nm5lr\") pod \"default-cloud1-ceil-event-smartgateway-6cc4dc6457-55sts\" (UID: \"134459cc-413b-4996-a0c1-aafe8dae8ebb\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-6cc4dc6457-55sts" Mar 16 00:36:22 crc kubenswrapper[4816]: I0316 00:36:22.153951 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-6cc4dc6457-55sts" Mar 16 00:36:27 crc kubenswrapper[4816]: I0316 00:36:27.343515 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-ceil-event-smartgateway-6cc4dc6457-55sts"] Mar 16 00:36:27 crc kubenswrapper[4816]: I0316 00:36:27.401711 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-coll-event-smartgateway-598dc6844-7wkmc"] Mar 16 00:36:27 crc kubenswrapper[4816]: W0316 00:36:27.412132 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod71ca3af9_1a2f_4bd2_898a_13c9089b16c2.slice/crio-bb68e8ff2fc4652bd7124872255f8c94e26e82f633c1abdf93bc90d074e3d742 WatchSource:0}: Error finding container bb68e8ff2fc4652bd7124872255f8c94e26e82f633c1abdf93bc90d074e3d742: Status 404 returned error can't find the container with id bb68e8ff2fc4652bd7124872255f8c94e26e82f633c1abdf93bc90d074e3d742 Mar 16 00:36:27 crc kubenswrapper[4816]: I0316 00:36:27.823475 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-gjt8v" event={"ID":"a7c7b38e-dd7e-469c-ab38-173944ca2943","Type":"ContainerStarted","Data":"1b2f537e39ddc44f71e65578b7748d4fdf3a00f92f7162e6c40e1d62de0821f7"} Mar 16 00:36:27 crc kubenswrapper[4816]: I0316 00:36:27.825264 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-event-smartgateway-598dc6844-7wkmc" event={"ID":"71ca3af9-1a2f-4bd2-898a-13c9089b16c2","Type":"ContainerStarted","Data":"ce9e99a9c02686068162a7b07011e1f6804e66a5e36c18c8a59058d037d454f1"} Mar 16 00:36:27 crc kubenswrapper[4816]: I0316 00:36:27.825294 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-event-smartgateway-598dc6844-7wkmc" 
event={"ID":"71ca3af9-1a2f-4bd2-898a-13c9089b16c2","Type":"ContainerStarted","Data":"bb68e8ff2fc4652bd7124872255f8c94e26e82f633c1abdf93bc90d074e3d742"} Mar 16 00:36:27 crc kubenswrapper[4816]: I0316 00:36:27.827733 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-lr826" event={"ID":"ee673348-980c-44f5-8e33-71a859ce740c","Type":"ContainerStarted","Data":"a93224c7f0c626d700e911445e6c6addd4f7c4e8be9b3890ffa81e6e0d5c2d7d"} Mar 16 00:36:27 crc kubenswrapper[4816]: I0316 00:36:27.827768 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-lr826" event={"ID":"ee673348-980c-44f5-8e33-71a859ce740c","Type":"ContainerStarted","Data":"4d8e0927fe2232f6e091b4714aa5409aa783badeda2f3341d66e596881120b91"} Mar 16 00:36:27 crc kubenswrapper[4816]: I0316 00:36:27.830282 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-m8n7t" event={"ID":"4de6f751-2471-4ce9-a771-00703e7be02a","Type":"ContainerStarted","Data":"7612b551538e59dc2650d3886c84b285d186f59576fce11885d1f9d53d3a74b2"} Mar 16 00:36:27 crc kubenswrapper[4816]: I0316 00:36:27.831711 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-6cc4dc6457-55sts" event={"ID":"134459cc-413b-4996-a0c1-aafe8dae8ebb","Type":"ContainerStarted","Data":"5d592ef65f51434a32e7ae05e9ef625d7d9825e6f8d4a65e1163ca86b1e07ca1"} Mar 16 00:36:27 crc kubenswrapper[4816]: I0316 00:36:27.831739 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-6cc4dc6457-55sts" event={"ID":"134459cc-413b-4996-a0c1-aafe8dae8ebb","Type":"ContainerStarted","Data":"48260c064384b04eabf949717c870cc6fe73c60cf45eba7e46ed0c8dc99d1dc5"} Mar 16 00:36:27 crc kubenswrapper[4816]: I0316 00:36:27.846207 4816 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-gjt8v" podStartSLOduration=3.868825257 podStartE2EDuration="17.846187881s" podCreationTimestamp="2026-03-16 00:36:10 +0000 UTC" firstStartedPulling="2026-03-16 00:36:13.160460158 +0000 UTC m=+1766.256760111" lastFinishedPulling="2026-03-16 00:36:27.137822772 +0000 UTC m=+1780.234122735" observedRunningTime="2026-03-16 00:36:27.839295416 +0000 UTC m=+1780.935595369" watchObservedRunningTime="2026-03-16 00:36:27.846187881 +0000 UTC m=+1780.942487844" Mar 16 00:36:27 crc kubenswrapper[4816]: I0316 00:36:27.869857 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-m8n7t" podStartSLOduration=6.93558617 podStartE2EDuration="20.869838968s" podCreationTimestamp="2026-03-16 00:36:07 +0000 UTC" firstStartedPulling="2026-03-16 00:36:13.160798657 +0000 UTC m=+1766.257098610" lastFinishedPulling="2026-03-16 00:36:27.095051455 +0000 UTC m=+1780.191351408" observedRunningTime="2026-03-16 00:36:27.866413761 +0000 UTC m=+1780.962713714" watchObservedRunningTime="2026-03-16 00:36:27.869838968 +0000 UTC m=+1780.966138931" Mar 16 00:36:27 crc kubenswrapper[4816]: I0316 00:36:27.900063 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-lr826" podStartSLOduration=3.6838836390000003 podStartE2EDuration="14.90003608s" podCreationTimestamp="2026-03-16 00:36:13 +0000 UTC" firstStartedPulling="2026-03-16 00:36:16.2426962 +0000 UTC m=+1769.338996163" lastFinishedPulling="2026-03-16 00:36:27.458848651 +0000 UTC m=+1780.555148604" observedRunningTime="2026-03-16 00:36:27.894635557 +0000 UTC m=+1780.990935510" watchObservedRunningTime="2026-03-16 00:36:27.90003608 +0000 UTC m=+1780.996336033" Mar 16 00:36:27 crc kubenswrapper[4816]: I0316 00:36:27.903527 4816 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="service-telemetry/prometheus-default-0" Mar 16 00:36:27 crc kubenswrapper[4816]: I0316 00:36:27.954740 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="service-telemetry/prometheus-default-0" Mar 16 00:36:28 crc kubenswrapper[4816]: I0316 00:36:28.840718 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-event-smartgateway-598dc6844-7wkmc" event={"ID":"71ca3af9-1a2f-4bd2-898a-13c9089b16c2","Type":"ContainerStarted","Data":"15495ca1215534138ba301daf0e5d9775679806317dd191e3f541ef7002473a8"} Mar 16 00:36:28 crc kubenswrapper[4816]: I0316 00:36:28.843118 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-6cc4dc6457-55sts" event={"ID":"134459cc-413b-4996-a0c1-aafe8dae8ebb","Type":"ContainerStarted","Data":"3b42bffd699bcd4c96385d701e9f370ec5fcc2d09a04c176fa9fda031519a60d"} Mar 16 00:36:28 crc kubenswrapper[4816]: I0316 00:36:28.872912 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-6cc4dc6457-55sts" podStartSLOduration=7.48546219 podStartE2EDuration="7.872888382s" podCreationTimestamp="2026-03-16 00:36:21 +0000 UTC" firstStartedPulling="2026-03-16 00:36:27.352467169 +0000 UTC m=+1780.448767122" lastFinishedPulling="2026-03-16 00:36:27.739893361 +0000 UTC m=+1780.836193314" observedRunningTime="2026-03-16 00:36:28.871390409 +0000 UTC m=+1781.967690402" watchObservedRunningTime="2026-03-16 00:36:28.872888382 +0000 UTC m=+1781.969188375" Mar 16 00:36:28 crc kubenswrapper[4816]: I0316 00:36:28.877511 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-cloud1-coll-event-smartgateway-598dc6844-7wkmc" podStartSLOduration=8.554112367 podStartE2EDuration="8.877495232s" podCreationTimestamp="2026-03-16 00:36:20 +0000 UTC" 
firstStartedPulling="2026-03-16 00:36:27.4155761 +0000 UTC m=+1780.511876053" lastFinishedPulling="2026-03-16 00:36:27.738958965 +0000 UTC m=+1780.835258918" observedRunningTime="2026-03-16 00:36:28.856028636 +0000 UTC m=+1781.952328609" watchObservedRunningTime="2026-03-16 00:36:28.877495232 +0000 UTC m=+1781.973795215" Mar 16 00:36:28 crc kubenswrapper[4816]: I0316 00:36:28.894235 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="service-telemetry/prometheus-default-0" Mar 16 00:36:31 crc kubenswrapper[4816]: I0316 00:36:31.862958 4816 patch_prober.go:28] interesting pod/machine-config-daemon-jrdcz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 16 00:36:31 crc kubenswrapper[4816]: I0316 00:36:31.863473 4816 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" podUID="dd08ece2-7636-4966-973a-e96a34b70b53" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 16 00:36:31 crc kubenswrapper[4816]: I0316 00:36:31.863517 4816 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" Mar 16 00:36:31 crc kubenswrapper[4816]: I0316 00:36:31.864114 4816 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ad2d141490ffe2b1f91c3a2296efeb65e936382d1300bb9398efe83e779ccd8a"} pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 16 00:36:31 crc kubenswrapper[4816]: I0316 00:36:31.864156 4816 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" podUID="dd08ece2-7636-4966-973a-e96a34b70b53" containerName="machine-config-daemon" containerID="cri-o://ad2d141490ffe2b1f91c3a2296efeb65e936382d1300bb9398efe83e779ccd8a" gracePeriod=600 Mar 16 00:36:32 crc kubenswrapper[4816]: E0316 00:36:32.001298 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrdcz_openshift-machine-config-operator(dd08ece2-7636-4966-973a-e96a34b70b53)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" podUID="dd08ece2-7636-4966-973a-e96a34b70b53" Mar 16 00:36:32 crc kubenswrapper[4816]: I0316 00:36:32.880318 4816 generic.go:334] "Generic (PLEG): container finished" podID="dd08ece2-7636-4966-973a-e96a34b70b53" containerID="ad2d141490ffe2b1f91c3a2296efeb65e936382d1300bb9398efe83e779ccd8a" exitCode=0 Mar 16 00:36:32 crc kubenswrapper[4816]: I0316 00:36:32.880408 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" event={"ID":"dd08ece2-7636-4966-973a-e96a34b70b53","Type":"ContainerDied","Data":"ad2d141490ffe2b1f91c3a2296efeb65e936382d1300bb9398efe83e779ccd8a"} Mar 16 00:36:32 crc kubenswrapper[4816]: I0316 00:36:32.880690 4816 scope.go:117] "RemoveContainer" containerID="92fd160da980a35a692640a98195800839c4f80b2447586e89c2230217ad0071" Mar 16 00:36:32 crc kubenswrapper[4816]: I0316 00:36:32.881644 4816 scope.go:117] "RemoveContainer" containerID="ad2d141490ffe2b1f91c3a2296efeb65e936382d1300bb9398efe83e779ccd8a" Mar 16 00:36:32 crc kubenswrapper[4816]: E0316 00:36:32.881941 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-jrdcz_openshift-machine-config-operator(dd08ece2-7636-4966-973a-e96a34b70b53)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" podUID="dd08ece2-7636-4966-973a-e96a34b70b53" Mar 16 00:36:33 crc kubenswrapper[4816]: I0316 00:36:33.972831 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-zdlgx"] Mar 16 00:36:33 crc kubenswrapper[4816]: I0316 00:36:33.973096 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/default-interconnect-68864d46cb-zdlgx" podUID="91573536-f8d4-475f-bfb6-dd2ad9910ce0" containerName="default-interconnect" containerID="cri-o://b845ba99f7acb2f04acbd6a1b07af235c4d1d04bb6986788b5b33868b6ffa6d3" gracePeriod=30 Mar 16 00:36:34 crc kubenswrapper[4816]: I0316 00:36:34.338804 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-interconnect-68864d46cb-zdlgx" Mar 16 00:36:34 crc kubenswrapper[4816]: I0316 00:36:34.449877 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/91573536-f8d4-475f-bfb6-dd2ad9910ce0-default-interconnect-inter-router-credentials\") pod \"91573536-f8d4-475f-bfb6-dd2ad9910ce0\" (UID: \"91573536-f8d4-475f-bfb6-dd2ad9910ce0\") " Mar 16 00:36:34 crc kubenswrapper[4816]: I0316 00:36:34.449980 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/91573536-f8d4-475f-bfb6-dd2ad9910ce0-sasl-config\") pod \"91573536-f8d4-475f-bfb6-dd2ad9910ce0\" (UID: \"91573536-f8d4-475f-bfb6-dd2ad9910ce0\") " Mar 16 00:36:34 crc kubenswrapper[4816]: I0316 00:36:34.450025 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-interconnect-inter-router-ca\" (UniqueName: 
\"kubernetes.io/secret/91573536-f8d4-475f-bfb6-dd2ad9910ce0-default-interconnect-inter-router-ca\") pod \"91573536-f8d4-475f-bfb6-dd2ad9910ce0\" (UID: \"91573536-f8d4-475f-bfb6-dd2ad9910ce0\") " Mar 16 00:36:34 crc kubenswrapper[4816]: I0316 00:36:34.450131 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/91573536-f8d4-475f-bfb6-dd2ad9910ce0-sasl-users\") pod \"91573536-f8d4-475f-bfb6-dd2ad9910ce0\" (UID: \"91573536-f8d4-475f-bfb6-dd2ad9910ce0\") " Mar 16 00:36:34 crc kubenswrapper[4816]: I0316 00:36:34.450185 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/91573536-f8d4-475f-bfb6-dd2ad9910ce0-default-interconnect-openstack-credentials\") pod \"91573536-f8d4-475f-bfb6-dd2ad9910ce0\" (UID: \"91573536-f8d4-475f-bfb6-dd2ad9910ce0\") " Mar 16 00:36:34 crc kubenswrapper[4816]: I0316 00:36:34.450215 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mth5q\" (UniqueName: \"kubernetes.io/projected/91573536-f8d4-475f-bfb6-dd2ad9910ce0-kube-api-access-mth5q\") pod \"91573536-f8d4-475f-bfb6-dd2ad9910ce0\" (UID: \"91573536-f8d4-475f-bfb6-dd2ad9910ce0\") " Mar 16 00:36:34 crc kubenswrapper[4816]: I0316 00:36:34.450250 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/91573536-f8d4-475f-bfb6-dd2ad9910ce0-default-interconnect-openstack-ca\") pod \"91573536-f8d4-475f-bfb6-dd2ad9910ce0\" (UID: \"91573536-f8d4-475f-bfb6-dd2ad9910ce0\") " Mar 16 00:36:34 crc kubenswrapper[4816]: I0316 00:36:34.450769 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/91573536-f8d4-475f-bfb6-dd2ad9910ce0-sasl-config" (OuterVolumeSpecName: "sasl-config") pod "91573536-f8d4-475f-bfb6-dd2ad9910ce0" 
(UID: "91573536-f8d4-475f-bfb6-dd2ad9910ce0"). InnerVolumeSpecName "sasl-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:36:34 crc kubenswrapper[4816]: I0316 00:36:34.451398 4816 reconciler_common.go:293] "Volume detached for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/91573536-f8d4-475f-bfb6-dd2ad9910ce0-sasl-config\") on node \"crc\" DevicePath \"\"" Mar 16 00:36:34 crc kubenswrapper[4816]: I0316 00:36:34.455652 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91573536-f8d4-475f-bfb6-dd2ad9910ce0-kube-api-access-mth5q" (OuterVolumeSpecName: "kube-api-access-mth5q") pod "91573536-f8d4-475f-bfb6-dd2ad9910ce0" (UID: "91573536-f8d4-475f-bfb6-dd2ad9910ce0"). InnerVolumeSpecName "kube-api-access-mth5q". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:36:34 crc kubenswrapper[4816]: I0316 00:36:34.455709 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91573536-f8d4-475f-bfb6-dd2ad9910ce0-default-interconnect-inter-router-credentials" (OuterVolumeSpecName: "default-interconnect-inter-router-credentials") pod "91573536-f8d4-475f-bfb6-dd2ad9910ce0" (UID: "91573536-f8d4-475f-bfb6-dd2ad9910ce0"). InnerVolumeSpecName "default-interconnect-inter-router-credentials". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:36:34 crc kubenswrapper[4816]: I0316 00:36:34.456537 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91573536-f8d4-475f-bfb6-dd2ad9910ce0-default-interconnect-openstack-ca" (OuterVolumeSpecName: "default-interconnect-openstack-ca") pod "91573536-f8d4-475f-bfb6-dd2ad9910ce0" (UID: "91573536-f8d4-475f-bfb6-dd2ad9910ce0"). InnerVolumeSpecName "default-interconnect-openstack-ca". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:36:34 crc kubenswrapper[4816]: I0316 00:36:34.459708 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91573536-f8d4-475f-bfb6-dd2ad9910ce0-default-interconnect-openstack-credentials" (OuterVolumeSpecName: "default-interconnect-openstack-credentials") pod "91573536-f8d4-475f-bfb6-dd2ad9910ce0" (UID: "91573536-f8d4-475f-bfb6-dd2ad9910ce0"). InnerVolumeSpecName "default-interconnect-openstack-credentials". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:36:34 crc kubenswrapper[4816]: I0316 00:36:34.459950 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91573536-f8d4-475f-bfb6-dd2ad9910ce0-default-interconnect-inter-router-ca" (OuterVolumeSpecName: "default-interconnect-inter-router-ca") pod "91573536-f8d4-475f-bfb6-dd2ad9910ce0" (UID: "91573536-f8d4-475f-bfb6-dd2ad9910ce0"). InnerVolumeSpecName "default-interconnect-inter-router-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:36:34 crc kubenswrapper[4816]: I0316 00:36:34.461662 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91573536-f8d4-475f-bfb6-dd2ad9910ce0-sasl-users" (OuterVolumeSpecName: "sasl-users") pod "91573536-f8d4-475f-bfb6-dd2ad9910ce0" (UID: "91573536-f8d4-475f-bfb6-dd2ad9910ce0"). InnerVolumeSpecName "sasl-users". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:36:34 crc kubenswrapper[4816]: I0316 00:36:34.552611 4816 reconciler_common.go:293] "Volume detached for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/91573536-f8d4-475f-bfb6-dd2ad9910ce0-default-interconnect-inter-router-credentials\") on node \"crc\" DevicePath \"\"" Mar 16 00:36:34 crc kubenswrapper[4816]: I0316 00:36:34.552648 4816 reconciler_common.go:293] "Volume detached for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/91573536-f8d4-475f-bfb6-dd2ad9910ce0-default-interconnect-inter-router-ca\") on node \"crc\" DevicePath \"\"" Mar 16 00:36:34 crc kubenswrapper[4816]: I0316 00:36:34.552661 4816 reconciler_common.go:293] "Volume detached for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/91573536-f8d4-475f-bfb6-dd2ad9910ce0-sasl-users\") on node \"crc\" DevicePath \"\"" Mar 16 00:36:34 crc kubenswrapper[4816]: I0316 00:36:34.552672 4816 reconciler_common.go:293] "Volume detached for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/91573536-f8d4-475f-bfb6-dd2ad9910ce0-default-interconnect-openstack-credentials\") on node \"crc\" DevicePath \"\"" Mar 16 00:36:34 crc kubenswrapper[4816]: I0316 00:36:34.552683 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mth5q\" (UniqueName: \"kubernetes.io/projected/91573536-f8d4-475f-bfb6-dd2ad9910ce0-kube-api-access-mth5q\") on node \"crc\" DevicePath \"\"" Mar 16 00:36:34 crc kubenswrapper[4816]: I0316 00:36:34.552692 4816 reconciler_common.go:293] "Volume detached for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/91573536-f8d4-475f-bfb6-dd2ad9910ce0-default-interconnect-openstack-ca\") on node \"crc\" DevicePath \"\"" Mar 16 00:36:34 crc kubenswrapper[4816]: I0316 00:36:34.894353 4816 generic.go:334] "Generic (PLEG): container finished" 
podID="91573536-f8d4-475f-bfb6-dd2ad9910ce0" containerID="b845ba99f7acb2f04acbd6a1b07af235c4d1d04bb6986788b5b33868b6ffa6d3" exitCode=0 Mar 16 00:36:34 crc kubenswrapper[4816]: I0316 00:36:34.894403 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-interconnect-68864d46cb-zdlgx" event={"ID":"91573536-f8d4-475f-bfb6-dd2ad9910ce0","Type":"ContainerDied","Data":"b845ba99f7acb2f04acbd6a1b07af235c4d1d04bb6986788b5b33868b6ffa6d3"} Mar 16 00:36:34 crc kubenswrapper[4816]: I0316 00:36:34.894717 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-interconnect-68864d46cb-zdlgx" event={"ID":"91573536-f8d4-475f-bfb6-dd2ad9910ce0","Type":"ContainerDied","Data":"26dc3f4ee67e3496e2e4a3275b5368fe7536b3be135627eecf62f5d01c3d1d56"} Mar 16 00:36:34 crc kubenswrapper[4816]: I0316 00:36:34.894739 4816 scope.go:117] "RemoveContainer" containerID="b845ba99f7acb2f04acbd6a1b07af235c4d1d04bb6986788b5b33868b6ffa6d3" Mar 16 00:36:34 crc kubenswrapper[4816]: I0316 00:36:34.894434 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-interconnect-68864d46cb-zdlgx" Mar 16 00:36:34 crc kubenswrapper[4816]: I0316 00:36:34.902413 4816 generic.go:334] "Generic (PLEG): container finished" podID="4de6f751-2471-4ce9-a771-00703e7be02a" containerID="f9d1d4fbe1538ba3d5bfad270eb0c25d99d3990476fb2033e0544e3446cf984e" exitCode=0 Mar 16 00:36:34 crc kubenswrapper[4816]: I0316 00:36:34.902500 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-m8n7t" event={"ID":"4de6f751-2471-4ce9-a771-00703e7be02a","Type":"ContainerDied","Data":"f9d1d4fbe1538ba3d5bfad270eb0c25d99d3990476fb2033e0544e3446cf984e"} Mar 16 00:36:34 crc kubenswrapper[4816]: I0316 00:36:34.903009 4816 scope.go:117] "RemoveContainer" containerID="f9d1d4fbe1538ba3d5bfad270eb0c25d99d3990476fb2033e0544e3446cf984e" Mar 16 00:36:34 crc kubenswrapper[4816]: I0316 00:36:34.905173 4816 generic.go:334] "Generic (PLEG): container finished" podID="134459cc-413b-4996-a0c1-aafe8dae8ebb" containerID="5d592ef65f51434a32e7ae05e9ef625d7d9825e6f8d4a65e1163ca86b1e07ca1" exitCode=0 Mar 16 00:36:34 crc kubenswrapper[4816]: I0316 00:36:34.905230 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-6cc4dc6457-55sts" event={"ID":"134459cc-413b-4996-a0c1-aafe8dae8ebb","Type":"ContainerDied","Data":"5d592ef65f51434a32e7ae05e9ef625d7d9825e6f8d4a65e1163ca86b1e07ca1"} Mar 16 00:36:34 crc kubenswrapper[4816]: I0316 00:36:34.913145 4816 scope.go:117] "RemoveContainer" containerID="5d592ef65f51434a32e7ae05e9ef625d7d9825e6f8d4a65e1163ca86b1e07ca1" Mar 16 00:36:34 crc kubenswrapper[4816]: I0316 00:36:34.917801 4816 scope.go:117] "RemoveContainer" containerID="b845ba99f7acb2f04acbd6a1b07af235c4d1d04bb6986788b5b33868b6ffa6d3" Mar 16 00:36:34 crc kubenswrapper[4816]: E0316 00:36:34.923680 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could 
not find container \"b845ba99f7acb2f04acbd6a1b07af235c4d1d04bb6986788b5b33868b6ffa6d3\": container with ID starting with b845ba99f7acb2f04acbd6a1b07af235c4d1d04bb6986788b5b33868b6ffa6d3 not found: ID does not exist" containerID="b845ba99f7acb2f04acbd6a1b07af235c4d1d04bb6986788b5b33868b6ffa6d3" Mar 16 00:36:34 crc kubenswrapper[4816]: I0316 00:36:34.923746 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b845ba99f7acb2f04acbd6a1b07af235c4d1d04bb6986788b5b33868b6ffa6d3"} err="failed to get container status \"b845ba99f7acb2f04acbd6a1b07af235c4d1d04bb6986788b5b33868b6ffa6d3\": rpc error: code = NotFound desc = could not find container \"b845ba99f7acb2f04acbd6a1b07af235c4d1d04bb6986788b5b33868b6ffa6d3\": container with ID starting with b845ba99f7acb2f04acbd6a1b07af235c4d1d04bb6986788b5b33868b6ffa6d3 not found: ID does not exist" Mar 16 00:36:34 crc kubenswrapper[4816]: I0316 00:36:34.925261 4816 generic.go:334] "Generic (PLEG): container finished" podID="a7c7b38e-dd7e-469c-ab38-173944ca2943" containerID="38d2258b8e0ac413732a26743c1c7812f0e79b91c6b5bd0ac251d5ae2c83223c" exitCode=0 Mar 16 00:36:34 crc kubenswrapper[4816]: I0316 00:36:34.925355 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-gjt8v" event={"ID":"a7c7b38e-dd7e-469c-ab38-173944ca2943","Type":"ContainerDied","Data":"38d2258b8e0ac413732a26743c1c7812f0e79b91c6b5bd0ac251d5ae2c83223c"} Mar 16 00:36:34 crc kubenswrapper[4816]: I0316 00:36:34.925884 4816 scope.go:117] "RemoveContainer" containerID="38d2258b8e0ac413732a26743c1c7812f0e79b91c6b5bd0ac251d5ae2c83223c" Mar 16 00:36:34 crc kubenswrapper[4816]: I0316 00:36:34.928970 4816 generic.go:334] "Generic (PLEG): container finished" podID="71ca3af9-1a2f-4bd2-898a-13c9089b16c2" containerID="ce9e99a9c02686068162a7b07011e1f6804e66a5e36c18c8a59058d037d454f1" exitCode=0 Mar 16 00:36:34 crc kubenswrapper[4816]: I0316 00:36:34.929065 4816 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-event-smartgateway-598dc6844-7wkmc" event={"ID":"71ca3af9-1a2f-4bd2-898a-13c9089b16c2","Type":"ContainerDied","Data":"ce9e99a9c02686068162a7b07011e1f6804e66a5e36c18c8a59058d037d454f1"} Mar 16 00:36:34 crc kubenswrapper[4816]: I0316 00:36:34.929644 4816 scope.go:117] "RemoveContainer" containerID="ce9e99a9c02686068162a7b07011e1f6804e66a5e36c18c8a59058d037d454f1" Mar 16 00:36:34 crc kubenswrapper[4816]: I0316 00:36:34.968177 4816 generic.go:334] "Generic (PLEG): container finished" podID="ee673348-980c-44f5-8e33-71a859ce740c" containerID="4d8e0927fe2232f6e091b4714aa5409aa783badeda2f3341d66e596881120b91" exitCode=0 Mar 16 00:36:34 crc kubenswrapper[4816]: I0316 00:36:34.968222 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-lr826" event={"ID":"ee673348-980c-44f5-8e33-71a859ce740c","Type":"ContainerDied","Data":"4d8e0927fe2232f6e091b4714aa5409aa783badeda2f3341d66e596881120b91"} Mar 16 00:36:34 crc kubenswrapper[4816]: I0316 00:36:34.968741 4816 scope.go:117] "RemoveContainer" containerID="4d8e0927fe2232f6e091b4714aa5409aa783badeda2f3341d66e596881120b91" Mar 16 00:36:34 crc kubenswrapper[4816]: I0316 00:36:34.995610 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-zdlgx"] Mar 16 00:36:35 crc kubenswrapper[4816]: I0316 00:36:35.009074 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-zdlgx"] Mar 16 00:36:35 crc kubenswrapper[4816]: I0316 00:36:35.638391 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-pdx9k"] Mar 16 00:36:35 crc kubenswrapper[4816]: E0316 00:36:35.638976 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91573536-f8d4-475f-bfb6-dd2ad9910ce0" containerName="default-interconnect" Mar 16 
00:36:35 crc kubenswrapper[4816]: I0316 00:36:35.638989 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="91573536-f8d4-475f-bfb6-dd2ad9910ce0" containerName="default-interconnect" Mar 16 00:36:35 crc kubenswrapper[4816]: I0316 00:36:35.639096 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="91573536-f8d4-475f-bfb6-dd2ad9910ce0" containerName="default-interconnect" Mar 16 00:36:35 crc kubenswrapper[4816]: I0316 00:36:35.639540 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-interconnect-68864d46cb-pdx9k" Mar 16 00:36:35 crc kubenswrapper[4816]: I0316 00:36:35.641688 4816 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-inter-router-credentials" Mar 16 00:36:35 crc kubenswrapper[4816]: I0316 00:36:35.641842 4816 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-inter-router-ca" Mar 16 00:36:35 crc kubenswrapper[4816]: I0316 00:36:35.641875 4816 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-openstack-ca" Mar 16 00:36:35 crc kubenswrapper[4816]: I0316 00:36:35.642013 4816 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-openstack-credentials" Mar 16 00:36:35 crc kubenswrapper[4816]: I0316 00:36:35.642573 4816 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-dockercfg-j42x2" Mar 16 00:36:35 crc kubenswrapper[4816]: I0316 00:36:35.642828 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"default-interconnect-sasl-config" Mar 16 00:36:35 crc kubenswrapper[4816]: I0316 00:36:35.643271 4816 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-users" Mar 16 00:36:35 crc kubenswrapper[4816]: I0316 00:36:35.699103 4816 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="91573536-f8d4-475f-bfb6-dd2ad9910ce0" path="/var/lib/kubelet/pods/91573536-f8d4-475f-bfb6-dd2ad9910ce0/volumes" Mar 16 00:36:35 crc kubenswrapper[4816]: I0316 00:36:35.699809 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-pdx9k"] Mar 16 00:36:35 crc kubenswrapper[4816]: I0316 00:36:35.771931 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/99cc3f5f-4a76-4ef9-9001-dda6329b3fe0-default-interconnect-inter-router-credentials\") pod \"default-interconnect-68864d46cb-pdx9k\" (UID: \"99cc3f5f-4a76-4ef9-9001-dda6329b3fe0\") " pod="service-telemetry/default-interconnect-68864d46cb-pdx9k" Mar 16 00:36:35 crc kubenswrapper[4816]: I0316 00:36:35.771975 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h9x2x\" (UniqueName: \"kubernetes.io/projected/99cc3f5f-4a76-4ef9-9001-dda6329b3fe0-kube-api-access-h9x2x\") pod \"default-interconnect-68864d46cb-pdx9k\" (UID: \"99cc3f5f-4a76-4ef9-9001-dda6329b3fe0\") " pod="service-telemetry/default-interconnect-68864d46cb-pdx9k" Mar 16 00:36:35 crc kubenswrapper[4816]: I0316 00:36:35.772015 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/99cc3f5f-4a76-4ef9-9001-dda6329b3fe0-sasl-config\") pod \"default-interconnect-68864d46cb-pdx9k\" (UID: \"99cc3f5f-4a76-4ef9-9001-dda6329b3fe0\") " pod="service-telemetry/default-interconnect-68864d46cb-pdx9k" Mar 16 00:36:35 crc kubenswrapper[4816]: I0316 00:36:35.772053 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-openstack-credentials\" (UniqueName: 
\"kubernetes.io/secret/99cc3f5f-4a76-4ef9-9001-dda6329b3fe0-default-interconnect-openstack-credentials\") pod \"default-interconnect-68864d46cb-pdx9k\" (UID: \"99cc3f5f-4a76-4ef9-9001-dda6329b3fe0\") " pod="service-telemetry/default-interconnect-68864d46cb-pdx9k" Mar 16 00:36:35 crc kubenswrapper[4816]: I0316 00:36:35.772144 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/99cc3f5f-4a76-4ef9-9001-dda6329b3fe0-default-interconnect-inter-router-ca\") pod \"default-interconnect-68864d46cb-pdx9k\" (UID: \"99cc3f5f-4a76-4ef9-9001-dda6329b3fe0\") " pod="service-telemetry/default-interconnect-68864d46cb-pdx9k" Mar 16 00:36:35 crc kubenswrapper[4816]: I0316 00:36:35.772180 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/99cc3f5f-4a76-4ef9-9001-dda6329b3fe0-default-interconnect-openstack-ca\") pod \"default-interconnect-68864d46cb-pdx9k\" (UID: \"99cc3f5f-4a76-4ef9-9001-dda6329b3fe0\") " pod="service-telemetry/default-interconnect-68864d46cb-pdx9k" Mar 16 00:36:35 crc kubenswrapper[4816]: I0316 00:36:35.772207 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/99cc3f5f-4a76-4ef9-9001-dda6329b3fe0-sasl-users\") pod \"default-interconnect-68864d46cb-pdx9k\" (UID: \"99cc3f5f-4a76-4ef9-9001-dda6329b3fe0\") " pod="service-telemetry/default-interconnect-68864d46cb-pdx9k" Mar 16 00:36:35 crc kubenswrapper[4816]: I0316 00:36:35.873612 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/99cc3f5f-4a76-4ef9-9001-dda6329b3fe0-default-interconnect-inter-router-ca\") pod \"default-interconnect-68864d46cb-pdx9k\" (UID: 
\"99cc3f5f-4a76-4ef9-9001-dda6329b3fe0\") " pod="service-telemetry/default-interconnect-68864d46cb-pdx9k" Mar 16 00:36:35 crc kubenswrapper[4816]: I0316 00:36:35.873746 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/99cc3f5f-4a76-4ef9-9001-dda6329b3fe0-default-interconnect-openstack-ca\") pod \"default-interconnect-68864d46cb-pdx9k\" (UID: \"99cc3f5f-4a76-4ef9-9001-dda6329b3fe0\") " pod="service-telemetry/default-interconnect-68864d46cb-pdx9k" Mar 16 00:36:35 crc kubenswrapper[4816]: I0316 00:36:35.873785 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/99cc3f5f-4a76-4ef9-9001-dda6329b3fe0-sasl-users\") pod \"default-interconnect-68864d46cb-pdx9k\" (UID: \"99cc3f5f-4a76-4ef9-9001-dda6329b3fe0\") " pod="service-telemetry/default-interconnect-68864d46cb-pdx9k" Mar 16 00:36:35 crc kubenswrapper[4816]: I0316 00:36:35.873875 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/99cc3f5f-4a76-4ef9-9001-dda6329b3fe0-default-interconnect-inter-router-credentials\") pod \"default-interconnect-68864d46cb-pdx9k\" (UID: \"99cc3f5f-4a76-4ef9-9001-dda6329b3fe0\") " pod="service-telemetry/default-interconnect-68864d46cb-pdx9k" Mar 16 00:36:35 crc kubenswrapper[4816]: I0316 00:36:35.873925 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h9x2x\" (UniqueName: \"kubernetes.io/projected/99cc3f5f-4a76-4ef9-9001-dda6329b3fe0-kube-api-access-h9x2x\") pod \"default-interconnect-68864d46cb-pdx9k\" (UID: \"99cc3f5f-4a76-4ef9-9001-dda6329b3fe0\") " pod="service-telemetry/default-interconnect-68864d46cb-pdx9k" Mar 16 00:36:35 crc kubenswrapper[4816]: I0316 00:36:35.873970 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/99cc3f5f-4a76-4ef9-9001-dda6329b3fe0-sasl-config\") pod \"default-interconnect-68864d46cb-pdx9k\" (UID: \"99cc3f5f-4a76-4ef9-9001-dda6329b3fe0\") " pod="service-telemetry/default-interconnect-68864d46cb-pdx9k" Mar 16 00:36:35 crc kubenswrapper[4816]: I0316 00:36:35.874028 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/99cc3f5f-4a76-4ef9-9001-dda6329b3fe0-default-interconnect-openstack-credentials\") pod \"default-interconnect-68864d46cb-pdx9k\" (UID: \"99cc3f5f-4a76-4ef9-9001-dda6329b3fe0\") " pod="service-telemetry/default-interconnect-68864d46cb-pdx9k" Mar 16 00:36:35 crc kubenswrapper[4816]: I0316 00:36:35.875370 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/99cc3f5f-4a76-4ef9-9001-dda6329b3fe0-sasl-config\") pod \"default-interconnect-68864d46cb-pdx9k\" (UID: \"99cc3f5f-4a76-4ef9-9001-dda6329b3fe0\") " pod="service-telemetry/default-interconnect-68864d46cb-pdx9k" Mar 16 00:36:35 crc kubenswrapper[4816]: I0316 00:36:35.879051 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/99cc3f5f-4a76-4ef9-9001-dda6329b3fe0-default-interconnect-inter-router-credentials\") pod \"default-interconnect-68864d46cb-pdx9k\" (UID: \"99cc3f5f-4a76-4ef9-9001-dda6329b3fe0\") " pod="service-telemetry/default-interconnect-68864d46cb-pdx9k" Mar 16 00:36:35 crc kubenswrapper[4816]: I0316 00:36:35.879707 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/99cc3f5f-4a76-4ef9-9001-dda6329b3fe0-default-interconnect-openstack-ca\") pod \"default-interconnect-68864d46cb-pdx9k\" (UID: \"99cc3f5f-4a76-4ef9-9001-dda6329b3fe0\") " 
pod="service-telemetry/default-interconnect-68864d46cb-pdx9k" Mar 16 00:36:35 crc kubenswrapper[4816]: I0316 00:36:35.879949 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/99cc3f5f-4a76-4ef9-9001-dda6329b3fe0-sasl-users\") pod \"default-interconnect-68864d46cb-pdx9k\" (UID: \"99cc3f5f-4a76-4ef9-9001-dda6329b3fe0\") " pod="service-telemetry/default-interconnect-68864d46cb-pdx9k" Mar 16 00:36:35 crc kubenswrapper[4816]: I0316 00:36:35.880466 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/99cc3f5f-4a76-4ef9-9001-dda6329b3fe0-default-interconnect-inter-router-ca\") pod \"default-interconnect-68864d46cb-pdx9k\" (UID: \"99cc3f5f-4a76-4ef9-9001-dda6329b3fe0\") " pod="service-telemetry/default-interconnect-68864d46cb-pdx9k" Mar 16 00:36:35 crc kubenswrapper[4816]: I0316 00:36:35.882069 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/99cc3f5f-4a76-4ef9-9001-dda6329b3fe0-default-interconnect-openstack-credentials\") pod \"default-interconnect-68864d46cb-pdx9k\" (UID: \"99cc3f5f-4a76-4ef9-9001-dda6329b3fe0\") " pod="service-telemetry/default-interconnect-68864d46cb-pdx9k" Mar 16 00:36:35 crc kubenswrapper[4816]: I0316 00:36:35.891993 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h9x2x\" (UniqueName: \"kubernetes.io/projected/99cc3f5f-4a76-4ef9-9001-dda6329b3fe0-kube-api-access-h9x2x\") pod \"default-interconnect-68864d46cb-pdx9k\" (UID: \"99cc3f5f-4a76-4ef9-9001-dda6329b3fe0\") " pod="service-telemetry/default-interconnect-68864d46cb-pdx9k" Mar 16 00:36:35 crc kubenswrapper[4816]: I0316 00:36:35.956723 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-interconnect-68864d46cb-pdx9k" Mar 16 00:36:35 crc kubenswrapper[4816]: I0316 00:36:35.978531 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-event-smartgateway-598dc6844-7wkmc" event={"ID":"71ca3af9-1a2f-4bd2-898a-13c9089b16c2","Type":"ContainerStarted","Data":"e016e48a78797dc3b0cf64c1397f7cda09474ea1f3aadd3add5fdda8aa3efae5"} Mar 16 00:36:35 crc kubenswrapper[4816]: I0316 00:36:35.986184 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-lr826" event={"ID":"ee673348-980c-44f5-8e33-71a859ce740c","Type":"ContainerStarted","Data":"71358c543140d46dcc9029d7f9ba4f05810ef31f0e4a20fc51f4ce7cf5f8e9fe"} Mar 16 00:36:36 crc kubenswrapper[4816]: I0316 00:36:36.006878 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-m8n7t" event={"ID":"4de6f751-2471-4ce9-a771-00703e7be02a","Type":"ContainerStarted","Data":"1d9c2a4c897d022060b76111731aab393689e9fa548bd986e485af3c95f78535"} Mar 16 00:36:36 crc kubenswrapper[4816]: I0316 00:36:36.028107 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-6cc4dc6457-55sts" event={"ID":"134459cc-413b-4996-a0c1-aafe8dae8ebb","Type":"ContainerStarted","Data":"d197a4e61a58561f29290e7cda9877096c50a36f6c773e38c2083c2bb6fe68f3"} Mar 16 00:36:36 crc kubenswrapper[4816]: I0316 00:36:36.061525 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-gjt8v" event={"ID":"a7c7b38e-dd7e-469c-ab38-173944ca2943","Type":"ContainerStarted","Data":"9de64e014f9263fa521079f37ae8e5d94487bd630409f3a42736c5916a55fc3d"} Mar 16 00:36:37 crc kubenswrapper[4816]: I0316 00:36:37.014241 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["service-telemetry/default-interconnect-68864d46cb-pdx9k"] Mar 16 00:36:37 crc kubenswrapper[4816]: I0316 00:36:37.075285 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-interconnect-68864d46cb-pdx9k" event={"ID":"99cc3f5f-4a76-4ef9-9001-dda6329b3fe0","Type":"ContainerStarted","Data":"6447d5e9ac68ceaa250e584e16281615c8ee2affd7076b26c02ac59d7bbb5483"} Mar 16 00:36:37 crc kubenswrapper[4816]: I0316 00:36:37.079257 4816 generic.go:334] "Generic (PLEG): container finished" podID="a7c7b38e-dd7e-469c-ab38-173944ca2943" containerID="9de64e014f9263fa521079f37ae8e5d94487bd630409f3a42736c5916a55fc3d" exitCode=0 Mar 16 00:36:37 crc kubenswrapper[4816]: I0316 00:36:37.079363 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-gjt8v" event={"ID":"a7c7b38e-dd7e-469c-ab38-173944ca2943","Type":"ContainerDied","Data":"9de64e014f9263fa521079f37ae8e5d94487bd630409f3a42736c5916a55fc3d"} Mar 16 00:36:37 crc kubenswrapper[4816]: I0316 00:36:37.083681 4816 scope.go:117] "RemoveContainer" containerID="38d2258b8e0ac413732a26743c1c7812f0e79b91c6b5bd0ac251d5ae2c83223c" Mar 16 00:36:37 crc kubenswrapper[4816]: I0316 00:36:37.084643 4816 scope.go:117] "RemoveContainer" containerID="9de64e014f9263fa521079f37ae8e5d94487bd630409f3a42736c5916a55fc3d" Mar 16 00:36:37 crc kubenswrapper[4816]: E0316 00:36:37.084951 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"bridge\" with CrashLoopBackOff: \"back-off 10s restarting failed container=bridge pod=default-cloud1-ceil-meter-smartgateway-57948895dc-gjt8v_service-telemetry(a7c7b38e-dd7e-469c-ab38-173944ca2943)\"" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-gjt8v" podUID="a7c7b38e-dd7e-469c-ab38-173944ca2943" Mar 16 00:36:37 crc kubenswrapper[4816]: I0316 00:36:37.094358 4816 generic.go:334] "Generic (PLEG): container finished" 
podID="71ca3af9-1a2f-4bd2-898a-13c9089b16c2" containerID="e016e48a78797dc3b0cf64c1397f7cda09474ea1f3aadd3add5fdda8aa3efae5" exitCode=0 Mar 16 00:36:37 crc kubenswrapper[4816]: I0316 00:36:37.094453 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-event-smartgateway-598dc6844-7wkmc" event={"ID":"71ca3af9-1a2f-4bd2-898a-13c9089b16c2","Type":"ContainerDied","Data":"e016e48a78797dc3b0cf64c1397f7cda09474ea1f3aadd3add5fdda8aa3efae5"} Mar 16 00:36:37 crc kubenswrapper[4816]: I0316 00:36:37.095234 4816 scope.go:117] "RemoveContainer" containerID="e016e48a78797dc3b0cf64c1397f7cda09474ea1f3aadd3add5fdda8aa3efae5" Mar 16 00:36:37 crc kubenswrapper[4816]: E0316 00:36:37.095738 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"bridge\" with CrashLoopBackOff: \"back-off 10s restarting failed container=bridge pod=default-cloud1-coll-event-smartgateway-598dc6844-7wkmc_service-telemetry(71ca3af9-1a2f-4bd2-898a-13c9089b16c2)\"" pod="service-telemetry/default-cloud1-coll-event-smartgateway-598dc6844-7wkmc" podUID="71ca3af9-1a2f-4bd2-898a-13c9089b16c2" Mar 16 00:36:37 crc kubenswrapper[4816]: I0316 00:36:37.111019 4816 generic.go:334] "Generic (PLEG): container finished" podID="ee673348-980c-44f5-8e33-71a859ce740c" containerID="71358c543140d46dcc9029d7f9ba4f05810ef31f0e4a20fc51f4ce7cf5f8e9fe" exitCode=0 Mar 16 00:36:37 crc kubenswrapper[4816]: I0316 00:36:37.111113 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-lr826" event={"ID":"ee673348-980c-44f5-8e33-71a859ce740c","Type":"ContainerDied","Data":"71358c543140d46dcc9029d7f9ba4f05810ef31f0e4a20fc51f4ce7cf5f8e9fe"} Mar 16 00:36:37 crc kubenswrapper[4816]: I0316 00:36:37.111884 4816 scope.go:117] "RemoveContainer" containerID="71358c543140d46dcc9029d7f9ba4f05810ef31f0e4a20fc51f4ce7cf5f8e9fe" Mar 16 00:36:37 crc kubenswrapper[4816]: E0316 00:36:37.112149 4816 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"bridge\" with CrashLoopBackOff: \"back-off 10s restarting failed container=bridge pod=default-cloud1-sens-meter-smartgateway-5759b4d97-lr826_service-telemetry(ee673348-980c-44f5-8e33-71a859ce740c)\"" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-lr826" podUID="ee673348-980c-44f5-8e33-71a859ce740c" Mar 16 00:36:37 crc kubenswrapper[4816]: I0316 00:36:37.115153 4816 generic.go:334] "Generic (PLEG): container finished" podID="4de6f751-2471-4ce9-a771-00703e7be02a" containerID="1d9c2a4c897d022060b76111731aab393689e9fa548bd986e485af3c95f78535" exitCode=0 Mar 16 00:36:37 crc kubenswrapper[4816]: I0316 00:36:37.115220 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-m8n7t" event={"ID":"4de6f751-2471-4ce9-a771-00703e7be02a","Type":"ContainerDied","Data":"1d9c2a4c897d022060b76111731aab393689e9fa548bd986e485af3c95f78535"} Mar 16 00:36:37 crc kubenswrapper[4816]: I0316 00:36:37.116029 4816 scope.go:117] "RemoveContainer" containerID="1d9c2a4c897d022060b76111731aab393689e9fa548bd986e485af3c95f78535" Mar 16 00:36:37 crc kubenswrapper[4816]: E0316 00:36:37.116319 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"bridge\" with CrashLoopBackOff: \"back-off 10s restarting failed container=bridge pod=default-cloud1-coll-meter-smartgateway-7cd87f9766-m8n7t_service-telemetry(4de6f751-2471-4ce9-a771-00703e7be02a)\"" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-m8n7t" podUID="4de6f751-2471-4ce9-a771-00703e7be02a" Mar 16 00:36:37 crc kubenswrapper[4816]: I0316 00:36:37.118000 4816 generic.go:334] "Generic (PLEG): container finished" podID="134459cc-413b-4996-a0c1-aafe8dae8ebb" containerID="d197a4e61a58561f29290e7cda9877096c50a36f6c773e38c2083c2bb6fe68f3" exitCode=0 Mar 16 00:36:37 crc kubenswrapper[4816]: I0316 
00:36:37.118023 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-6cc4dc6457-55sts" event={"ID":"134459cc-413b-4996-a0c1-aafe8dae8ebb","Type":"ContainerDied","Data":"d197a4e61a58561f29290e7cda9877096c50a36f6c773e38c2083c2bb6fe68f3"} Mar 16 00:36:37 crc kubenswrapper[4816]: I0316 00:36:37.118406 4816 scope.go:117] "RemoveContainer" containerID="d197a4e61a58561f29290e7cda9877096c50a36f6c773e38c2083c2bb6fe68f3" Mar 16 00:36:37 crc kubenswrapper[4816]: E0316 00:36:37.118661 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"bridge\" with CrashLoopBackOff: \"back-off 10s restarting failed container=bridge pod=default-cloud1-ceil-event-smartgateway-6cc4dc6457-55sts_service-telemetry(134459cc-413b-4996-a0c1-aafe8dae8ebb)\"" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-6cc4dc6457-55sts" podUID="134459cc-413b-4996-a0c1-aafe8dae8ebb" Mar 16 00:36:37 crc kubenswrapper[4816]: I0316 00:36:37.166126 4816 scope.go:117] "RemoveContainer" containerID="ce9e99a9c02686068162a7b07011e1f6804e66a5e36c18c8a59058d037d454f1" Mar 16 00:36:37 crc kubenswrapper[4816]: I0316 00:36:37.274686 4816 scope.go:117] "RemoveContainer" containerID="4d8e0927fe2232f6e091b4714aa5409aa783badeda2f3341d66e596881120b91" Mar 16 00:36:37 crc kubenswrapper[4816]: I0316 00:36:37.330240 4816 scope.go:117] "RemoveContainer" containerID="f9d1d4fbe1538ba3d5bfad270eb0c25d99d3990476fb2033e0544e3446cf984e" Mar 16 00:36:37 crc kubenswrapper[4816]: I0316 00:36:37.378316 4816 scope.go:117] "RemoveContainer" containerID="5d592ef65f51434a32e7ae05e9ef625d7d9825e6f8d4a65e1163ca86b1e07ca1" Mar 16 00:36:38 crc kubenswrapper[4816]: I0316 00:36:38.130804 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-interconnect-68864d46cb-pdx9k" 
event={"ID":"99cc3f5f-4a76-4ef9-9001-dda6329b3fe0","Type":"ContainerStarted","Data":"9ccc389f0ff36e4f9911d642e3cd6149ad785d77e5645e73cd4748b136fb9338"} Mar 16 00:36:38 crc kubenswrapper[4816]: I0316 00:36:38.155041 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-interconnect-68864d46cb-pdx9k" podStartSLOduration=5.155004348 podStartE2EDuration="5.155004348s" podCreationTimestamp="2026-03-16 00:36:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:36:38.148469654 +0000 UTC m=+1791.244769607" watchObservedRunningTime="2026-03-16 00:36:38.155004348 +0000 UTC m=+1791.251304301" Mar 16 00:36:39 crc kubenswrapper[4816]: I0316 00:36:39.245089 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/qdr-test"] Mar 16 00:36:39 crc kubenswrapper[4816]: I0316 00:36:39.246526 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/qdr-test" Mar 16 00:36:39 crc kubenswrapper[4816]: I0316 00:36:39.249148 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"qdr-test-config" Mar 16 00:36:39 crc kubenswrapper[4816]: I0316 00:36:39.249512 4816 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-selfsigned" Mar 16 00:36:39 crc kubenswrapper[4816]: I0316 00:36:39.255333 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/qdr-test"] Mar 16 00:36:39 crc kubenswrapper[4816]: I0316 00:36:39.358975 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"qdr-test-config\" (UniqueName: \"kubernetes.io/configmap/e014dd01-f826-4642-bb43-dbdab4a1e503-qdr-test-config\") pod \"qdr-test\" (UID: \"e014dd01-f826-4642-bb43-dbdab4a1e503\") " pod="service-telemetry/qdr-test" Mar 16 00:36:39 crc kubenswrapper[4816]: I0316 00:36:39.359088 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48w8l\" (UniqueName: \"kubernetes.io/projected/e014dd01-f826-4642-bb43-dbdab4a1e503-kube-api-access-48w8l\") pod \"qdr-test\" (UID: \"e014dd01-f826-4642-bb43-dbdab4a1e503\") " pod="service-telemetry/qdr-test" Mar 16 00:36:39 crc kubenswrapper[4816]: I0316 00:36:39.359203 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-selfsigned-cert\" (UniqueName: \"kubernetes.io/secret/e014dd01-f826-4642-bb43-dbdab4a1e503-default-interconnect-selfsigned-cert\") pod \"qdr-test\" (UID: \"e014dd01-f826-4642-bb43-dbdab4a1e503\") " pod="service-telemetry/qdr-test" Mar 16 00:36:39 crc kubenswrapper[4816]: I0316 00:36:39.461590 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-selfsigned-cert\" (UniqueName: 
\"kubernetes.io/secret/e014dd01-f826-4642-bb43-dbdab4a1e503-default-interconnect-selfsigned-cert\") pod \"qdr-test\" (UID: \"e014dd01-f826-4642-bb43-dbdab4a1e503\") " pod="service-telemetry/qdr-test" Mar 16 00:36:39 crc kubenswrapper[4816]: I0316 00:36:39.462051 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"qdr-test-config\" (UniqueName: \"kubernetes.io/configmap/e014dd01-f826-4642-bb43-dbdab4a1e503-qdr-test-config\") pod \"qdr-test\" (UID: \"e014dd01-f826-4642-bb43-dbdab4a1e503\") " pod="service-telemetry/qdr-test" Mar 16 00:36:39 crc kubenswrapper[4816]: I0316 00:36:39.462087 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-48w8l\" (UniqueName: \"kubernetes.io/projected/e014dd01-f826-4642-bb43-dbdab4a1e503-kube-api-access-48w8l\") pod \"qdr-test\" (UID: \"e014dd01-f826-4642-bb43-dbdab4a1e503\") " pod="service-telemetry/qdr-test" Mar 16 00:36:39 crc kubenswrapper[4816]: I0316 00:36:39.463006 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"qdr-test-config\" (UniqueName: \"kubernetes.io/configmap/e014dd01-f826-4642-bb43-dbdab4a1e503-qdr-test-config\") pod \"qdr-test\" (UID: \"e014dd01-f826-4642-bb43-dbdab4a1e503\") " pod="service-telemetry/qdr-test" Mar 16 00:36:39 crc kubenswrapper[4816]: I0316 00:36:39.484692 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-selfsigned-cert\" (UniqueName: \"kubernetes.io/secret/e014dd01-f826-4642-bb43-dbdab4a1e503-default-interconnect-selfsigned-cert\") pod \"qdr-test\" (UID: \"e014dd01-f826-4642-bb43-dbdab4a1e503\") " pod="service-telemetry/qdr-test" Mar 16 00:36:39 crc kubenswrapper[4816]: I0316 00:36:39.486179 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-48w8l\" (UniqueName: \"kubernetes.io/projected/e014dd01-f826-4642-bb43-dbdab4a1e503-kube-api-access-48w8l\") pod \"qdr-test\" (UID: \"e014dd01-f826-4642-bb43-dbdab4a1e503\") 
" pod="service-telemetry/qdr-test" Mar 16 00:36:39 crc kubenswrapper[4816]: I0316 00:36:39.685543 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/qdr-test" Mar 16 00:36:40 crc kubenswrapper[4816]: I0316 00:36:40.106644 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/qdr-test"] Mar 16 00:36:40 crc kubenswrapper[4816]: I0316 00:36:40.154257 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/qdr-test" event={"ID":"e014dd01-f826-4642-bb43-dbdab4a1e503","Type":"ContainerStarted","Data":"67c52596879ee5f97550865d1d0d9f1c65faddf356aba2e3cf412b3627b9ccd2"} Mar 16 00:36:44 crc kubenswrapper[4816]: I0316 00:36:44.667559 4816 scope.go:117] "RemoveContainer" containerID="ad2d141490ffe2b1f91c3a2296efeb65e936382d1300bb9398efe83e779ccd8a" Mar 16 00:36:44 crc kubenswrapper[4816]: E0316 00:36:44.668214 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrdcz_openshift-machine-config-operator(dd08ece2-7636-4966-973a-e96a34b70b53)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" podUID="dd08ece2-7636-4966-973a-e96a34b70b53" Mar 16 00:36:48 crc kubenswrapper[4816]: I0316 00:36:48.667389 4816 scope.go:117] "RemoveContainer" containerID="d197a4e61a58561f29290e7cda9877096c50a36f6c773e38c2083c2bb6fe68f3" Mar 16 00:36:48 crc kubenswrapper[4816]: I0316 00:36:48.669357 4816 scope.go:117] "RemoveContainer" containerID="71358c543140d46dcc9029d7f9ba4f05810ef31f0e4a20fc51f4ce7cf5f8e9fe" Mar 16 00:36:50 crc kubenswrapper[4816]: I0316 00:36:50.668200 4816 scope.go:117] "RemoveContainer" containerID="e016e48a78797dc3b0cf64c1397f7cda09474ea1f3aadd3add5fdda8aa3efae5" Mar 16 00:36:51 crc kubenswrapper[4816]: I0316 00:36:51.241328 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="service-telemetry/default-cloud1-ceil-event-smartgateway-6cc4dc6457-55sts" event={"ID":"134459cc-413b-4996-a0c1-aafe8dae8ebb","Type":"ContainerStarted","Data":"5f3024deeed51fd8bdc966d9c0ccf9f70fef0634d72805adf14cd3fe5250b028"} Mar 16 00:36:51 crc kubenswrapper[4816]: I0316 00:36:51.243055 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/qdr-test" event={"ID":"e014dd01-f826-4642-bb43-dbdab4a1e503","Type":"ContainerStarted","Data":"269fbfe42442cb6eba5484dbb42362b640f52dbb1f8ab10fd7c9f6ce790f6fb9"} Mar 16 00:36:51 crc kubenswrapper[4816]: I0316 00:36:51.249899 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-event-smartgateway-598dc6844-7wkmc" event={"ID":"71ca3af9-1a2f-4bd2-898a-13c9089b16c2","Type":"ContainerStarted","Data":"dffc633da63589e27e68b277b5f3d8bc25104b4b4f15449073562fc7d1e7d31e"} Mar 16 00:36:51 crc kubenswrapper[4816]: I0316 00:36:51.256665 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-lr826" event={"ID":"ee673348-980c-44f5-8e33-71a859ce740c","Type":"ContainerStarted","Data":"4664d3b7085c3c707a8f0d1c0767e164bf765d5f82870b12f6105bae899d19e0"} Mar 16 00:36:51 crc kubenswrapper[4816]: I0316 00:36:51.314565 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/qdr-test" podStartSLOduration=2.121244039 podStartE2EDuration="12.314532807s" podCreationTimestamp="2026-03-16 00:36:39 +0000 UTC" firstStartedPulling="2026-03-16 00:36:40.111355022 +0000 UTC m=+1793.207654975" lastFinishedPulling="2026-03-16 00:36:50.30464379 +0000 UTC m=+1803.400943743" observedRunningTime="2026-03-16 00:36:51.296282322 +0000 UTC m=+1804.392582295" watchObservedRunningTime="2026-03-16 00:36:51.314532807 +0000 UTC m=+1804.410832760" Mar 16 00:36:51 crc kubenswrapper[4816]: I0316 00:36:51.591674 4816 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["service-telemetry/stf-smoketest-smoke1-78t94"] Mar 16 00:36:51 crc kubenswrapper[4816]: I0316 00:36:51.594881 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/stf-smoketest-smoke1-78t94" Mar 16 00:36:51 crc kubenswrapper[4816]: I0316 00:36:51.604034 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"stf-smoketest-ceilometer-entrypoint-script" Mar 16 00:36:51 crc kubenswrapper[4816]: I0316 00:36:51.604307 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"stf-smoketest-healthcheck-log" Mar 16 00:36:51 crc kubenswrapper[4816]: I0316 00:36:51.604461 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"stf-smoketest-collectd-entrypoint-script" Mar 16 00:36:51 crc kubenswrapper[4816]: I0316 00:36:51.604631 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"stf-smoketest-sensubility-config" Mar 16 00:36:51 crc kubenswrapper[4816]: I0316 00:36:51.605990 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"stf-smoketest-collectd-config" Mar 16 00:36:51 crc kubenswrapper[4816]: I0316 00:36:51.606026 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"stf-smoketest-ceilometer-publisher" Mar 16 00:36:51 crc kubenswrapper[4816]: I0316 00:36:51.636803 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/stf-smoketest-smoke1-78t94"] Mar 16 00:36:51 crc kubenswrapper[4816]: I0316 00:36:51.726964 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"healthcheck-log\" (UniqueName: \"kubernetes.io/configmap/bfb5a27e-f6df-44ba-8f68-446946410953-healthcheck-log\") pod \"stf-smoketest-smoke1-78t94\" (UID: \"bfb5a27e-f6df-44ba-8f68-446946410953\") " pod="service-telemetry/stf-smoketest-smoke1-78t94" Mar 16 00:36:51 crc 
kubenswrapper[4816]: I0316 00:36:51.727033 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collectd-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/bfb5a27e-f6df-44ba-8f68-446946410953-collectd-entrypoint-script\") pod \"stf-smoketest-smoke1-78t94\" (UID: \"bfb5a27e-f6df-44ba-8f68-446946410953\") " pod="service-telemetry/stf-smoketest-smoke1-78t94" Mar 16 00:36:51 crc kubenswrapper[4816]: I0316 00:36:51.727061 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-publisher\" (UniqueName: \"kubernetes.io/configmap/bfb5a27e-f6df-44ba-8f68-446946410953-ceilometer-publisher\") pod \"stf-smoketest-smoke1-78t94\" (UID: \"bfb5a27e-f6df-44ba-8f68-446946410953\") " pod="service-telemetry/stf-smoketest-smoke1-78t94" Mar 16 00:36:51 crc kubenswrapper[4816]: I0316 00:36:51.727081 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collectd-config\" (UniqueName: \"kubernetes.io/configmap/bfb5a27e-f6df-44ba-8f68-446946410953-collectd-config\") pod \"stf-smoketest-smoke1-78t94\" (UID: \"bfb5a27e-f6df-44ba-8f68-446946410953\") " pod="service-telemetry/stf-smoketest-smoke1-78t94" Mar 16 00:36:51 crc kubenswrapper[4816]: I0316 00:36:51.727139 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sensubility-config\" (UniqueName: \"kubernetes.io/configmap/bfb5a27e-f6df-44ba-8f68-446946410953-sensubility-config\") pod \"stf-smoketest-smoke1-78t94\" (UID: \"bfb5a27e-f6df-44ba-8f68-446946410953\") " pod="service-telemetry/stf-smoketest-smoke1-78t94" Mar 16 00:36:51 crc kubenswrapper[4816]: I0316 00:36:51.727156 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/bfb5a27e-f6df-44ba-8f68-446946410953-ceilometer-entrypoint-script\") pod 
\"stf-smoketest-smoke1-78t94\" (UID: \"bfb5a27e-f6df-44ba-8f68-446946410953\") " pod="service-telemetry/stf-smoketest-smoke1-78t94" Mar 16 00:36:51 crc kubenswrapper[4816]: I0316 00:36:51.727200 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mj6gg\" (UniqueName: \"kubernetes.io/projected/bfb5a27e-f6df-44ba-8f68-446946410953-kube-api-access-mj6gg\") pod \"stf-smoketest-smoke1-78t94\" (UID: \"bfb5a27e-f6df-44ba-8f68-446946410953\") " pod="service-telemetry/stf-smoketest-smoke1-78t94" Mar 16 00:36:51 crc kubenswrapper[4816]: I0316 00:36:51.828529 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"healthcheck-log\" (UniqueName: \"kubernetes.io/configmap/bfb5a27e-f6df-44ba-8f68-446946410953-healthcheck-log\") pod \"stf-smoketest-smoke1-78t94\" (UID: \"bfb5a27e-f6df-44ba-8f68-446946410953\") " pod="service-telemetry/stf-smoketest-smoke1-78t94" Mar 16 00:36:51 crc kubenswrapper[4816]: I0316 00:36:51.829077 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collectd-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/bfb5a27e-f6df-44ba-8f68-446946410953-collectd-entrypoint-script\") pod \"stf-smoketest-smoke1-78t94\" (UID: \"bfb5a27e-f6df-44ba-8f68-446946410953\") " pod="service-telemetry/stf-smoketest-smoke1-78t94" Mar 16 00:36:51 crc kubenswrapper[4816]: I0316 00:36:51.829295 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-publisher\" (UniqueName: \"kubernetes.io/configmap/bfb5a27e-f6df-44ba-8f68-446946410953-ceilometer-publisher\") pod \"stf-smoketest-smoke1-78t94\" (UID: \"bfb5a27e-f6df-44ba-8f68-446946410953\") " pod="service-telemetry/stf-smoketest-smoke1-78t94" Mar 16 00:36:51 crc kubenswrapper[4816]: I0316 00:36:51.830175 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"healthcheck-log\" (UniqueName: 
\"kubernetes.io/configmap/bfb5a27e-f6df-44ba-8f68-446946410953-healthcheck-log\") pod \"stf-smoketest-smoke1-78t94\" (UID: \"bfb5a27e-f6df-44ba-8f68-446946410953\") " pod="service-telemetry/stf-smoketest-smoke1-78t94" Mar 16 00:36:51 crc kubenswrapper[4816]: I0316 00:36:51.830204 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collectd-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/bfb5a27e-f6df-44ba-8f68-446946410953-collectd-entrypoint-script\") pod \"stf-smoketest-smoke1-78t94\" (UID: \"bfb5a27e-f6df-44ba-8f68-446946410953\") " pod="service-telemetry/stf-smoketest-smoke1-78t94" Mar 16 00:36:51 crc kubenswrapper[4816]: I0316 00:36:51.830710 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-publisher\" (UniqueName: \"kubernetes.io/configmap/bfb5a27e-f6df-44ba-8f68-446946410953-ceilometer-publisher\") pod \"stf-smoketest-smoke1-78t94\" (UID: \"bfb5a27e-f6df-44ba-8f68-446946410953\") " pod="service-telemetry/stf-smoketest-smoke1-78t94" Mar 16 00:36:51 crc kubenswrapper[4816]: I0316 00:36:51.830845 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collectd-config\" (UniqueName: \"kubernetes.io/configmap/bfb5a27e-f6df-44ba-8f68-446946410953-collectd-config\") pod \"stf-smoketest-smoke1-78t94\" (UID: \"bfb5a27e-f6df-44ba-8f68-446946410953\") " pod="service-telemetry/stf-smoketest-smoke1-78t94" Mar 16 00:36:51 crc kubenswrapper[4816]: I0316 00:36:51.831862 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sensubility-config\" (UniqueName: \"kubernetes.io/configmap/bfb5a27e-f6df-44ba-8f68-446946410953-sensubility-config\") pod \"stf-smoketest-smoke1-78t94\" (UID: \"bfb5a27e-f6df-44ba-8f68-446946410953\") " pod="service-telemetry/stf-smoketest-smoke1-78t94" Mar 16 00:36:51 crc kubenswrapper[4816]: I0316 00:36:51.833213 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-entrypoint-script\" 
(UniqueName: \"kubernetes.io/configmap/bfb5a27e-f6df-44ba-8f68-446946410953-ceilometer-entrypoint-script\") pod \"stf-smoketest-smoke1-78t94\" (UID: \"bfb5a27e-f6df-44ba-8f68-446946410953\") " pod="service-telemetry/stf-smoketest-smoke1-78t94" Mar 16 00:36:51 crc kubenswrapper[4816]: I0316 00:36:51.834140 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mj6gg\" (UniqueName: \"kubernetes.io/projected/bfb5a27e-f6df-44ba-8f68-446946410953-kube-api-access-mj6gg\") pod \"stf-smoketest-smoke1-78t94\" (UID: \"bfb5a27e-f6df-44ba-8f68-446946410953\") " pod="service-telemetry/stf-smoketest-smoke1-78t94" Mar 16 00:36:51 crc kubenswrapper[4816]: I0316 00:36:51.831701 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collectd-config\" (UniqueName: \"kubernetes.io/configmap/bfb5a27e-f6df-44ba-8f68-446946410953-collectd-config\") pod \"stf-smoketest-smoke1-78t94\" (UID: \"bfb5a27e-f6df-44ba-8f68-446946410953\") " pod="service-telemetry/stf-smoketest-smoke1-78t94" Mar 16 00:36:51 crc kubenswrapper[4816]: I0316 00:36:51.834078 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/bfb5a27e-f6df-44ba-8f68-446946410953-ceilometer-entrypoint-script\") pod \"stf-smoketest-smoke1-78t94\" (UID: \"bfb5a27e-f6df-44ba-8f68-446946410953\") " pod="service-telemetry/stf-smoketest-smoke1-78t94" Mar 16 00:36:51 crc kubenswrapper[4816]: I0316 00:36:51.833166 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sensubility-config\" (UniqueName: \"kubernetes.io/configmap/bfb5a27e-f6df-44ba-8f68-446946410953-sensubility-config\") pod \"stf-smoketest-smoke1-78t94\" (UID: \"bfb5a27e-f6df-44ba-8f68-446946410953\") " pod="service-telemetry/stf-smoketest-smoke1-78t94" Mar 16 00:36:51 crc kubenswrapper[4816]: I0316 00:36:51.861119 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-mj6gg\" (UniqueName: \"kubernetes.io/projected/bfb5a27e-f6df-44ba-8f68-446946410953-kube-api-access-mj6gg\") pod \"stf-smoketest-smoke1-78t94\" (UID: \"bfb5a27e-f6df-44ba-8f68-446946410953\") " pod="service-telemetry/stf-smoketest-smoke1-78t94" Mar 16 00:36:51 crc kubenswrapper[4816]: I0316 00:36:51.926529 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/stf-smoketest-smoke1-78t94" Mar 16 00:36:52 crc kubenswrapper[4816]: I0316 00:36:52.002769 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/curl"] Mar 16 00:36:52 crc kubenswrapper[4816]: I0316 00:36:52.006014 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/curl" Mar 16 00:36:52 crc kubenswrapper[4816]: I0316 00:36:52.025879 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/curl"] Mar 16 00:36:52 crc kubenswrapper[4816]: I0316 00:36:52.036765 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhk6p\" (UniqueName: \"kubernetes.io/projected/a5b0de9a-0a4c-468c-b49f-575ecbc053e3-kube-api-access-vhk6p\") pod \"curl\" (UID: \"a5b0de9a-0a4c-468c-b49f-575ecbc053e3\") " pod="service-telemetry/curl" Mar 16 00:36:52 crc kubenswrapper[4816]: I0316 00:36:52.139330 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vhk6p\" (UniqueName: \"kubernetes.io/projected/a5b0de9a-0a4c-468c-b49f-575ecbc053e3-kube-api-access-vhk6p\") pod \"curl\" (UID: \"a5b0de9a-0a4c-468c-b49f-575ecbc053e3\") " pod="service-telemetry/curl" Mar 16 00:36:52 crc kubenswrapper[4816]: I0316 00:36:52.160186 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhk6p\" (UniqueName: \"kubernetes.io/projected/a5b0de9a-0a4c-468c-b49f-575ecbc053e3-kube-api-access-vhk6p\") pod \"curl\" (UID: \"a5b0de9a-0a4c-468c-b49f-575ecbc053e3\") " 
pod="service-telemetry/curl" Mar 16 00:36:52 crc kubenswrapper[4816]: I0316 00:36:52.360929 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/curl" Mar 16 00:36:52 crc kubenswrapper[4816]: W0316 00:36:52.384987 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbfb5a27e_f6df_44ba_8f68_446946410953.slice/crio-b874d2ee7cf3967963c00611ffab38c70d37853ea80cc511f1984da79b61eef1 WatchSource:0}: Error finding container b874d2ee7cf3967963c00611ffab38c70d37853ea80cc511f1984da79b61eef1: Status 404 returned error can't find the container with id b874d2ee7cf3967963c00611ffab38c70d37853ea80cc511f1984da79b61eef1 Mar 16 00:36:52 crc kubenswrapper[4816]: I0316 00:36:52.388030 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/stf-smoketest-smoke1-78t94"] Mar 16 00:36:52 crc kubenswrapper[4816]: I0316 00:36:52.574042 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/curl"] Mar 16 00:36:52 crc kubenswrapper[4816]: I0316 00:36:52.667260 4816 scope.go:117] "RemoveContainer" containerID="1d9c2a4c897d022060b76111731aab393689e9fa548bd986e485af3c95f78535" Mar 16 00:36:52 crc kubenswrapper[4816]: I0316 00:36:52.667601 4816 scope.go:117] "RemoveContainer" containerID="9de64e014f9263fa521079f37ae8e5d94487bd630409f3a42736c5916a55fc3d" Mar 16 00:36:53 crc kubenswrapper[4816]: I0316 00:36:53.286793 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/curl" event={"ID":"a5b0de9a-0a4c-468c-b49f-575ecbc053e3","Type":"ContainerStarted","Data":"10214b7e53a9fba954863af7a37aa60f7cfe09a8c0032bf25fa1b27dc6986052"} Mar 16 00:36:53 crc kubenswrapper[4816]: I0316 00:36:53.293872 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-gjt8v" 
event={"ID":"a7c7b38e-dd7e-469c-ab38-173944ca2943","Type":"ContainerStarted","Data":"c5909de608a6f63e49b5ed75bb4931e802de58428fa60054194935d2f74d6935"} Mar 16 00:36:53 crc kubenswrapper[4816]: I0316 00:36:53.297284 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/stf-smoketest-smoke1-78t94" event={"ID":"bfb5a27e-f6df-44ba-8f68-446946410953","Type":"ContainerStarted","Data":"b874d2ee7cf3967963c00611ffab38c70d37853ea80cc511f1984da79b61eef1"} Mar 16 00:36:53 crc kubenswrapper[4816]: I0316 00:36:53.325305 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-m8n7t" event={"ID":"4de6f751-2471-4ce9-a771-00703e7be02a","Type":"ContainerStarted","Data":"f9d46a2edd2a0590c87d3bccb06b155ce8d343cace750651acb690e7067885c4"} Mar 16 00:36:55 crc kubenswrapper[4816]: I0316 00:36:55.344823 4816 generic.go:334] "Generic (PLEG): container finished" podID="a5b0de9a-0a4c-468c-b49f-575ecbc053e3" containerID="69e70052395bd582e905946efddaebf9f7da1e715ed33c4428ae00a7088cbadf" exitCode=0 Mar 16 00:36:55 crc kubenswrapper[4816]: I0316 00:36:55.344920 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/curl" event={"ID":"a5b0de9a-0a4c-468c-b49f-575ecbc053e3","Type":"ContainerDied","Data":"69e70052395bd582e905946efddaebf9f7da1e715ed33c4428ae00a7088cbadf"} Mar 16 00:36:58 crc kubenswrapper[4816]: I0316 00:36:58.667454 4816 scope.go:117] "RemoveContainer" containerID="ad2d141490ffe2b1f91c3a2296efeb65e936382d1300bb9398efe83e779ccd8a" Mar 16 00:36:58 crc kubenswrapper[4816]: E0316 00:36:58.668179 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrdcz_openshift-machine-config-operator(dd08ece2-7636-4966-973a-e96a34b70b53)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" podUID="dd08ece2-7636-4966-973a-e96a34b70b53" Mar 16 00:37:01 crc kubenswrapper[4816]: I0316 00:37:01.932512 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/curl" Mar 16 00:37:02 crc kubenswrapper[4816]: I0316 00:37:02.018389 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vhk6p\" (UniqueName: \"kubernetes.io/projected/a5b0de9a-0a4c-468c-b49f-575ecbc053e3-kube-api-access-vhk6p\") pod \"a5b0de9a-0a4c-468c-b49f-575ecbc053e3\" (UID: \"a5b0de9a-0a4c-468c-b49f-575ecbc053e3\") " Mar 16 00:37:02 crc kubenswrapper[4816]: I0316 00:37:02.023736 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5b0de9a-0a4c-468c-b49f-575ecbc053e3-kube-api-access-vhk6p" (OuterVolumeSpecName: "kube-api-access-vhk6p") pod "a5b0de9a-0a4c-468c-b49f-575ecbc053e3" (UID: "a5b0de9a-0a4c-468c-b49f-575ecbc053e3"). InnerVolumeSpecName "kube-api-access-vhk6p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:37:02 crc kubenswrapper[4816]: I0316 00:37:02.080320 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_curl_a5b0de9a-0a4c-468c-b49f-575ecbc053e3/curl/0.log" Mar 16 00:37:02 crc kubenswrapper[4816]: I0316 00:37:02.120011 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vhk6p\" (UniqueName: \"kubernetes.io/projected/a5b0de9a-0a4c-468c-b49f-575ecbc053e3-kube-api-access-vhk6p\") on node \"crc\" DevicePath \"\"" Mar 16 00:37:02 crc kubenswrapper[4816]: I0316 00:37:02.301199 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-snmp-webhook-6856cfb745-2n6hc_5be5ca83-5116-48dc-8d6c-733cbd3e9682/prometheus-webhook-snmp/0.log" Mar 16 00:37:02 crc kubenswrapper[4816]: I0316 00:37:02.395514 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/curl" event={"ID":"a5b0de9a-0a4c-468c-b49f-575ecbc053e3","Type":"ContainerDied","Data":"10214b7e53a9fba954863af7a37aa60f7cfe09a8c0032bf25fa1b27dc6986052"} Mar 16 00:37:02 crc kubenswrapper[4816]: I0316 00:37:02.395567 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/curl" Mar 16 00:37:02 crc kubenswrapper[4816]: I0316 00:37:02.395577 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="10214b7e53a9fba954863af7a37aa60f7cfe09a8c0032bf25fa1b27dc6986052" Mar 16 00:37:04 crc kubenswrapper[4816]: I0316 00:37:04.412385 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/stf-smoketest-smoke1-78t94" event={"ID":"bfb5a27e-f6df-44ba-8f68-446946410953","Type":"ContainerStarted","Data":"dff32730efec159cd6d8cf3ea205ebc1778cc70fb4d2a5f6c005a8a2de2b5ade"} Mar 16 00:37:05 crc kubenswrapper[4816]: I0316 00:37:05.389970 4816 scope.go:117] "RemoveContainer" containerID="a1355d11ec449f6a9fd6597a935b6361539d556da9968192441a1a7760e23960" Mar 16 00:37:11 crc kubenswrapper[4816]: I0316 00:37:11.465129 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/stf-smoketest-smoke1-78t94" event={"ID":"bfb5a27e-f6df-44ba-8f68-446946410953","Type":"ContainerStarted","Data":"694d068f0560de8c0ccec981067fb7a82ff8d6a56fdaddbf1e422c2f99d7ec45"} Mar 16 00:37:11 crc kubenswrapper[4816]: I0316 00:37:11.491791 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/stf-smoketest-smoke1-78t94" podStartSLOduration=2.5798966290000003 podStartE2EDuration="20.491771344s" podCreationTimestamp="2026-03-16 00:36:51 +0000 UTC" firstStartedPulling="2026-03-16 00:36:52.391220988 +0000 UTC m=+1805.487520941" lastFinishedPulling="2026-03-16 00:37:10.303095683 +0000 UTC m=+1823.399395656" observedRunningTime="2026-03-16 00:37:11.485752734 +0000 UTC m=+1824.582052687" watchObservedRunningTime="2026-03-16 00:37:11.491771344 +0000 UTC m=+1824.588071297" Mar 16 00:37:12 crc kubenswrapper[4816]: I0316 00:37:12.667571 4816 scope.go:117] "RemoveContainer" containerID="ad2d141490ffe2b1f91c3a2296efeb65e936382d1300bb9398efe83e779ccd8a" Mar 16 00:37:12 crc kubenswrapper[4816]: E0316 00:37:12.667853 4816 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrdcz_openshift-machine-config-operator(dd08ece2-7636-4966-973a-e96a34b70b53)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" podUID="dd08ece2-7636-4966-973a-e96a34b70b53" Mar 16 00:37:23 crc kubenswrapper[4816]: I0316 00:37:23.667672 4816 scope.go:117] "RemoveContainer" containerID="ad2d141490ffe2b1f91c3a2296efeb65e936382d1300bb9398efe83e779ccd8a" Mar 16 00:37:23 crc kubenswrapper[4816]: E0316 00:37:23.668475 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrdcz_openshift-machine-config-operator(dd08ece2-7636-4966-973a-e96a34b70b53)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" podUID="dd08ece2-7636-4966-973a-e96a34b70b53" Mar 16 00:37:32 crc kubenswrapper[4816]: I0316 00:37:32.479746 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-snmp-webhook-6856cfb745-2n6hc_5be5ca83-5116-48dc-8d6c-733cbd3e9682/prometheus-webhook-snmp/0.log" Mar 16 00:37:35 crc kubenswrapper[4816]: I0316 00:37:35.667175 4816 scope.go:117] "RemoveContainer" containerID="ad2d141490ffe2b1f91c3a2296efeb65e936382d1300bb9398efe83e779ccd8a" Mar 16 00:37:35 crc kubenswrapper[4816]: E0316 00:37:35.667671 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrdcz_openshift-machine-config-operator(dd08ece2-7636-4966-973a-e96a34b70b53)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" 
podUID="dd08ece2-7636-4966-973a-e96a34b70b53" Mar 16 00:37:37 crc kubenswrapper[4816]: I0316 00:37:37.683899 4816 generic.go:334] "Generic (PLEG): container finished" podID="bfb5a27e-f6df-44ba-8f68-446946410953" containerID="dff32730efec159cd6d8cf3ea205ebc1778cc70fb4d2a5f6c005a8a2de2b5ade" exitCode=0 Mar 16 00:37:37 crc kubenswrapper[4816]: I0316 00:37:37.688924 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/stf-smoketest-smoke1-78t94" event={"ID":"bfb5a27e-f6df-44ba-8f68-446946410953","Type":"ContainerDied","Data":"dff32730efec159cd6d8cf3ea205ebc1778cc70fb4d2a5f6c005a8a2de2b5ade"} Mar 16 00:37:37 crc kubenswrapper[4816]: I0316 00:37:37.689908 4816 scope.go:117] "RemoveContainer" containerID="dff32730efec159cd6d8cf3ea205ebc1778cc70fb4d2a5f6c005a8a2de2b5ade" Mar 16 00:37:42 crc kubenswrapper[4816]: I0316 00:37:42.729349 4816 generic.go:334] "Generic (PLEG): container finished" podID="bfb5a27e-f6df-44ba-8f68-446946410953" containerID="694d068f0560de8c0ccec981067fb7a82ff8d6a56fdaddbf1e422c2f99d7ec45" exitCode=0 Mar 16 00:37:42 crc kubenswrapper[4816]: I0316 00:37:42.729400 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/stf-smoketest-smoke1-78t94" event={"ID":"bfb5a27e-f6df-44ba-8f68-446946410953","Type":"ContainerDied","Data":"694d068f0560de8c0ccec981067fb7a82ff8d6a56fdaddbf1e422c2f99d7ec45"} Mar 16 00:37:44 crc kubenswrapper[4816]: I0316 00:37:44.051597 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/stf-smoketest-smoke1-78t94" Mar 16 00:37:44 crc kubenswrapper[4816]: I0316 00:37:44.154086 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"collectd-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/bfb5a27e-f6df-44ba-8f68-446946410953-collectd-entrypoint-script\") pod \"bfb5a27e-f6df-44ba-8f68-446946410953\" (UID: \"bfb5a27e-f6df-44ba-8f68-446946410953\") " Mar 16 00:37:44 crc kubenswrapper[4816]: I0316 00:37:44.154153 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mj6gg\" (UniqueName: \"kubernetes.io/projected/bfb5a27e-f6df-44ba-8f68-446946410953-kube-api-access-mj6gg\") pod \"bfb5a27e-f6df-44ba-8f68-446946410953\" (UID: \"bfb5a27e-f6df-44ba-8f68-446946410953\") " Mar 16 00:37:44 crc kubenswrapper[4816]: I0316 00:37:44.154186 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-publisher\" (UniqueName: \"kubernetes.io/configmap/bfb5a27e-f6df-44ba-8f68-446946410953-ceilometer-publisher\") pod \"bfb5a27e-f6df-44ba-8f68-446946410953\" (UID: \"bfb5a27e-f6df-44ba-8f68-446946410953\") " Mar 16 00:37:44 crc kubenswrapper[4816]: I0316 00:37:44.154220 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"healthcheck-log\" (UniqueName: \"kubernetes.io/configmap/bfb5a27e-f6df-44ba-8f68-446946410953-healthcheck-log\") pod \"bfb5a27e-f6df-44ba-8f68-446946410953\" (UID: \"bfb5a27e-f6df-44ba-8f68-446946410953\") " Mar 16 00:37:44 crc kubenswrapper[4816]: I0316 00:37:44.154245 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sensubility-config\" (UniqueName: \"kubernetes.io/configmap/bfb5a27e-f6df-44ba-8f68-446946410953-sensubility-config\") pod \"bfb5a27e-f6df-44ba-8f68-446946410953\" (UID: \"bfb5a27e-f6df-44ba-8f68-446946410953\") " Mar 16 00:37:44 crc kubenswrapper[4816]: I0316 00:37:44.154287 4816 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ceilometer-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/bfb5a27e-f6df-44ba-8f68-446946410953-ceilometer-entrypoint-script\") pod \"bfb5a27e-f6df-44ba-8f68-446946410953\" (UID: \"bfb5a27e-f6df-44ba-8f68-446946410953\") " Mar 16 00:37:44 crc kubenswrapper[4816]: I0316 00:37:44.154347 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"collectd-config\" (UniqueName: \"kubernetes.io/configmap/bfb5a27e-f6df-44ba-8f68-446946410953-collectd-config\") pod \"bfb5a27e-f6df-44ba-8f68-446946410953\" (UID: \"bfb5a27e-f6df-44ba-8f68-446946410953\") " Mar 16 00:37:44 crc kubenswrapper[4816]: I0316 00:37:44.176218 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bfb5a27e-f6df-44ba-8f68-446946410953-collectd-entrypoint-script" (OuterVolumeSpecName: "collectd-entrypoint-script") pod "bfb5a27e-f6df-44ba-8f68-446946410953" (UID: "bfb5a27e-f6df-44ba-8f68-446946410953"). InnerVolumeSpecName "collectd-entrypoint-script". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:37:44 crc kubenswrapper[4816]: I0316 00:37:44.190732 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bfb5a27e-f6df-44ba-8f68-446946410953-kube-api-access-mj6gg" (OuterVolumeSpecName: "kube-api-access-mj6gg") pod "bfb5a27e-f6df-44ba-8f68-446946410953" (UID: "bfb5a27e-f6df-44ba-8f68-446946410953"). InnerVolumeSpecName "kube-api-access-mj6gg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:37:44 crc kubenswrapper[4816]: I0316 00:37:44.210980 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bfb5a27e-f6df-44ba-8f68-446946410953-ceilometer-publisher" (OuterVolumeSpecName: "ceilometer-publisher") pod "bfb5a27e-f6df-44ba-8f68-446946410953" (UID: "bfb5a27e-f6df-44ba-8f68-446946410953"). InnerVolumeSpecName "ceilometer-publisher". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:37:44 crc kubenswrapper[4816]: I0316 00:37:44.216065 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bfb5a27e-f6df-44ba-8f68-446946410953-sensubility-config" (OuterVolumeSpecName: "sensubility-config") pod "bfb5a27e-f6df-44ba-8f68-446946410953" (UID: "bfb5a27e-f6df-44ba-8f68-446946410953"). InnerVolumeSpecName "sensubility-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:37:44 crc kubenswrapper[4816]: I0316 00:37:44.224482 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bfb5a27e-f6df-44ba-8f68-446946410953-healthcheck-log" (OuterVolumeSpecName: "healthcheck-log") pod "bfb5a27e-f6df-44ba-8f68-446946410953" (UID: "bfb5a27e-f6df-44ba-8f68-446946410953"). InnerVolumeSpecName "healthcheck-log". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:37:44 crc kubenswrapper[4816]: I0316 00:37:44.235911 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bfb5a27e-f6df-44ba-8f68-446946410953-ceilometer-entrypoint-script" (OuterVolumeSpecName: "ceilometer-entrypoint-script") pod "bfb5a27e-f6df-44ba-8f68-446946410953" (UID: "bfb5a27e-f6df-44ba-8f68-446946410953"). InnerVolumeSpecName "ceilometer-entrypoint-script". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:37:44 crc kubenswrapper[4816]: I0316 00:37:44.247173 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bfb5a27e-f6df-44ba-8f68-446946410953-collectd-config" (OuterVolumeSpecName: "collectd-config") pod "bfb5a27e-f6df-44ba-8f68-446946410953" (UID: "bfb5a27e-f6df-44ba-8f68-446946410953"). InnerVolumeSpecName "collectd-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:37:44 crc kubenswrapper[4816]: I0316 00:37:44.256269 4816 reconciler_common.go:293] "Volume detached for volume \"collectd-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/bfb5a27e-f6df-44ba-8f68-446946410953-collectd-entrypoint-script\") on node \"crc\" DevicePath \"\"" Mar 16 00:37:44 crc kubenswrapper[4816]: I0316 00:37:44.256309 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mj6gg\" (UniqueName: \"kubernetes.io/projected/bfb5a27e-f6df-44ba-8f68-446946410953-kube-api-access-mj6gg\") on node \"crc\" DevicePath \"\"" Mar 16 00:37:44 crc kubenswrapper[4816]: I0316 00:37:44.256321 4816 reconciler_common.go:293] "Volume detached for volume \"ceilometer-publisher\" (UniqueName: \"kubernetes.io/configmap/bfb5a27e-f6df-44ba-8f68-446946410953-ceilometer-publisher\") on node \"crc\" DevicePath \"\"" Mar 16 00:37:44 crc kubenswrapper[4816]: I0316 00:37:44.256336 4816 reconciler_common.go:293] "Volume detached for volume \"healthcheck-log\" (UniqueName: \"kubernetes.io/configmap/bfb5a27e-f6df-44ba-8f68-446946410953-healthcheck-log\") on node \"crc\" DevicePath \"\"" Mar 16 00:37:44 crc kubenswrapper[4816]: I0316 00:37:44.256349 4816 reconciler_common.go:293] "Volume detached for volume \"sensubility-config\" (UniqueName: \"kubernetes.io/configmap/bfb5a27e-f6df-44ba-8f68-446946410953-sensubility-config\") on node \"crc\" DevicePath \"\"" Mar 16 00:37:44 crc kubenswrapper[4816]: I0316 00:37:44.256359 4816 reconciler_common.go:293] "Volume detached for volume \"ceilometer-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/bfb5a27e-f6df-44ba-8f68-446946410953-ceilometer-entrypoint-script\") on node \"crc\" DevicePath \"\"" Mar 16 00:37:44 crc kubenswrapper[4816]: I0316 00:37:44.256371 4816 reconciler_common.go:293] "Volume detached for volume \"collectd-config\" (UniqueName: \"kubernetes.io/configmap/bfb5a27e-f6df-44ba-8f68-446946410953-collectd-config\") on node 
\"crc\" DevicePath \"\"" Mar 16 00:37:44 crc kubenswrapper[4816]: I0316 00:37:44.746817 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/stf-smoketest-smoke1-78t94" event={"ID":"bfb5a27e-f6df-44ba-8f68-446946410953","Type":"ContainerDied","Data":"b874d2ee7cf3967963c00611ffab38c70d37853ea80cc511f1984da79b61eef1"} Mar 16 00:37:44 crc kubenswrapper[4816]: I0316 00:37:44.746863 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b874d2ee7cf3967963c00611ffab38c70d37853ea80cc511f1984da79b61eef1" Mar 16 00:37:44 crc kubenswrapper[4816]: I0316 00:37:44.746913 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/stf-smoketest-smoke1-78t94" Mar 16 00:37:46 crc kubenswrapper[4816]: I0316 00:37:45.998521 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_stf-smoketest-smoke1-78t94_bfb5a27e-f6df-44ba-8f68-446946410953/smoketest-collectd/0.log" Mar 16 00:37:46 crc kubenswrapper[4816]: I0316 00:37:46.303539 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_stf-smoketest-smoke1-78t94_bfb5a27e-f6df-44ba-8f68-446946410953/smoketest-ceilometer/0.log" Mar 16 00:37:46 crc kubenswrapper[4816]: I0316 00:37:46.563110 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-interconnect-68864d46cb-pdx9k_99cc3f5f-4a76-4ef9-9001-dda6329b3fe0/default-interconnect/0.log" Mar 16 00:37:46 crc kubenswrapper[4816]: I0316 00:37:46.815425 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-coll-meter-smartgateway-7cd87f9766-m8n7t_4de6f751-2471-4ce9-a771-00703e7be02a/bridge/2.log" Mar 16 00:37:47 crc kubenswrapper[4816]: I0316 00:37:47.067469 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-coll-meter-smartgateway-7cd87f9766-m8n7t_4de6f751-2471-4ce9-a771-00703e7be02a/sg-core/0.log" Mar 16 00:37:47 
crc kubenswrapper[4816]: I0316 00:37:47.346340 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-coll-event-smartgateway-598dc6844-7wkmc_71ca3af9-1a2f-4bd2-898a-13c9089b16c2/bridge/2.log" Mar 16 00:37:47 crc kubenswrapper[4816]: I0316 00:37:47.611326 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-coll-event-smartgateway-598dc6844-7wkmc_71ca3af9-1a2f-4bd2-898a-13c9089b16c2/sg-core/0.log" Mar 16 00:37:47 crc kubenswrapper[4816]: I0316 00:37:47.891616 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-ceil-meter-smartgateway-57948895dc-gjt8v_a7c7b38e-dd7e-469c-ab38-173944ca2943/bridge/2.log" Mar 16 00:37:48 crc kubenswrapper[4816]: I0316 00:37:48.183089 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-ceil-meter-smartgateway-57948895dc-gjt8v_a7c7b38e-dd7e-469c-ab38-173944ca2943/sg-core/0.log" Mar 16 00:37:48 crc kubenswrapper[4816]: I0316 00:37:48.465610 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-ceil-event-smartgateway-6cc4dc6457-55sts_134459cc-413b-4996-a0c1-aafe8dae8ebb/bridge/2.log" Mar 16 00:37:48 crc kubenswrapper[4816]: I0316 00:37:48.770145 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-ceil-event-smartgateway-6cc4dc6457-55sts_134459cc-413b-4996-a0c1-aafe8dae8ebb/sg-core/0.log" Mar 16 00:37:49 crc kubenswrapper[4816]: I0316 00:37:49.006794 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-sens-meter-smartgateway-5759b4d97-lr826_ee673348-980c-44f5-8e33-71a859ce740c/bridge/2.log" Mar 16 00:37:49 crc kubenswrapper[4816]: I0316 00:37:49.266201 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-sens-meter-smartgateway-5759b4d97-lr826_ee673348-980c-44f5-8e33-71a859ce740c/sg-core/0.log" 
Mar 16 00:37:50 crc kubenswrapper[4816]: I0316 00:37:50.667506 4816 scope.go:117] "RemoveContainer" containerID="ad2d141490ffe2b1f91c3a2296efeb65e936382d1300bb9398efe83e779ccd8a" Mar 16 00:37:50 crc kubenswrapper[4816]: E0316 00:37:50.667789 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrdcz_openshift-machine-config-operator(dd08ece2-7636-4966-973a-e96a34b70b53)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" podUID="dd08ece2-7636-4966-973a-e96a34b70b53" Mar 16 00:37:52 crc kubenswrapper[4816]: I0316 00:37:52.733153 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-55c8479bdf-6m44w_e96079bc-73ba-420e-9568-cea10077c4ae/operator/0.log" Mar 16 00:37:52 crc kubenswrapper[4816]: I0316 00:37:52.976944 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_prometheus-default-0_078376fd-a0f8-4157-8a07-23ce85695dc6/prometheus/0.log" Mar 16 00:37:53 crc kubenswrapper[4816]: I0316 00:37:53.234257 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_elasticsearch-es-default-0_819af9fc-6db9-4743-bd06-f844f5ef5b0d/elasticsearch/0.log" Mar 16 00:37:53 crc kubenswrapper[4816]: I0316 00:37:53.492338 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-snmp-webhook-6856cfb745-2n6hc_5be5ca83-5116-48dc-8d6c-733cbd3e9682/prometheus-webhook-snmp/0.log" Mar 16 00:37:53 crc kubenswrapper[4816]: I0316 00:37:53.732772 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_alertmanager-default-0_f4698c34-c93e-4d6f-8ab8-2bfcf3118410/alertmanager/0.log" Mar 16 00:38:00 crc kubenswrapper[4816]: I0316 00:38:00.150130 4816 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-infra/auto-csr-approver-29560358-dc9b5"] Mar 16 00:38:00 crc kubenswrapper[4816]: E0316 00:38:00.151106 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfb5a27e-f6df-44ba-8f68-446946410953" containerName="smoketest-ceilometer" Mar 16 00:38:00 crc kubenswrapper[4816]: I0316 00:38:00.151126 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfb5a27e-f6df-44ba-8f68-446946410953" containerName="smoketest-ceilometer" Mar 16 00:38:00 crc kubenswrapper[4816]: E0316 00:38:00.151199 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5b0de9a-0a4c-468c-b49f-575ecbc053e3" containerName="curl" Mar 16 00:38:00 crc kubenswrapper[4816]: I0316 00:38:00.151209 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5b0de9a-0a4c-468c-b49f-575ecbc053e3" containerName="curl" Mar 16 00:38:00 crc kubenswrapper[4816]: E0316 00:38:00.151229 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfb5a27e-f6df-44ba-8f68-446946410953" containerName="smoketest-collectd" Mar 16 00:38:00 crc kubenswrapper[4816]: I0316 00:38:00.151240 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfb5a27e-f6df-44ba-8f68-446946410953" containerName="smoketest-collectd" Mar 16 00:38:00 crc kubenswrapper[4816]: I0316 00:38:00.151416 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="bfb5a27e-f6df-44ba-8f68-446946410953" containerName="smoketest-ceilometer" Mar 16 00:38:00 crc kubenswrapper[4816]: I0316 00:38:00.151437 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="bfb5a27e-f6df-44ba-8f68-446946410953" containerName="smoketest-collectd" Mar 16 00:38:00 crc kubenswrapper[4816]: I0316 00:38:00.151453 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5b0de9a-0a4c-468c-b49f-575ecbc053e3" containerName="curl" Mar 16 00:38:00 crc kubenswrapper[4816]: I0316 00:38:00.153172 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29560358-dc9b5" Mar 16 00:38:00 crc kubenswrapper[4816]: I0316 00:38:00.156063 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 16 00:38:00 crc kubenswrapper[4816]: I0316 00:38:00.156103 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 16 00:38:00 crc kubenswrapper[4816]: I0316 00:38:00.158313 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29560358-dc9b5"] Mar 16 00:38:00 crc kubenswrapper[4816]: I0316 00:38:00.163114 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-8hc2r" Mar 16 00:38:00 crc kubenswrapper[4816]: I0316 00:38:00.186619 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nvgsn\" (UniqueName: \"kubernetes.io/projected/f515fa39-31b2-47d2-b50d-d7b95a1cf7a1-kube-api-access-nvgsn\") pod \"auto-csr-approver-29560358-dc9b5\" (UID: \"f515fa39-31b2-47d2-b50d-d7b95a1cf7a1\") " pod="openshift-infra/auto-csr-approver-29560358-dc9b5" Mar 16 00:38:00 crc kubenswrapper[4816]: I0316 00:38:00.288825 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nvgsn\" (UniqueName: \"kubernetes.io/projected/f515fa39-31b2-47d2-b50d-d7b95a1cf7a1-kube-api-access-nvgsn\") pod \"auto-csr-approver-29560358-dc9b5\" (UID: \"f515fa39-31b2-47d2-b50d-d7b95a1cf7a1\") " pod="openshift-infra/auto-csr-approver-29560358-dc9b5" Mar 16 00:38:00 crc kubenswrapper[4816]: I0316 00:38:00.307497 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nvgsn\" (UniqueName: \"kubernetes.io/projected/f515fa39-31b2-47d2-b50d-d7b95a1cf7a1-kube-api-access-nvgsn\") pod \"auto-csr-approver-29560358-dc9b5\" (UID: \"f515fa39-31b2-47d2-b50d-d7b95a1cf7a1\") " 
pod="openshift-infra/auto-csr-approver-29560358-dc9b5" Mar 16 00:38:00 crc kubenswrapper[4816]: I0316 00:38:00.475795 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560358-dc9b5" Mar 16 00:38:00 crc kubenswrapper[4816]: I0316 00:38:00.911596 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29560358-dc9b5"] Mar 16 00:38:01 crc kubenswrapper[4816]: I0316 00:38:01.904870 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560358-dc9b5" event={"ID":"f515fa39-31b2-47d2-b50d-d7b95a1cf7a1","Type":"ContainerStarted","Data":"cff1a249957940ab78d95b02deb4fde3f8fbd654950d4bd438fb1337cb77b306"} Mar 16 00:38:02 crc kubenswrapper[4816]: I0316 00:38:02.914830 4816 generic.go:334] "Generic (PLEG): container finished" podID="f515fa39-31b2-47d2-b50d-d7b95a1cf7a1" containerID="a277fc9eef3079702bbb7065b00f7756e2712612c3ceca27a724c44263f7e5ac" exitCode=0 Mar 16 00:38:02 crc kubenswrapper[4816]: I0316 00:38:02.914902 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560358-dc9b5" event={"ID":"f515fa39-31b2-47d2-b50d-d7b95a1cf7a1","Type":"ContainerDied","Data":"a277fc9eef3079702bbb7065b00f7756e2712612c3ceca27a724c44263f7e5ac"} Mar 16 00:38:04 crc kubenswrapper[4816]: I0316 00:38:04.169176 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29560358-dc9b5" Mar 16 00:38:04 crc kubenswrapper[4816]: I0316 00:38:04.259910 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nvgsn\" (UniqueName: \"kubernetes.io/projected/f515fa39-31b2-47d2-b50d-d7b95a1cf7a1-kube-api-access-nvgsn\") pod \"f515fa39-31b2-47d2-b50d-d7b95a1cf7a1\" (UID: \"f515fa39-31b2-47d2-b50d-d7b95a1cf7a1\") " Mar 16 00:38:04 crc kubenswrapper[4816]: I0316 00:38:04.276338 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f515fa39-31b2-47d2-b50d-d7b95a1cf7a1-kube-api-access-nvgsn" (OuterVolumeSpecName: "kube-api-access-nvgsn") pod "f515fa39-31b2-47d2-b50d-d7b95a1cf7a1" (UID: "f515fa39-31b2-47d2-b50d-d7b95a1cf7a1"). InnerVolumeSpecName "kube-api-access-nvgsn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:38:04 crc kubenswrapper[4816]: I0316 00:38:04.361844 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nvgsn\" (UniqueName: \"kubernetes.io/projected/f515fa39-31b2-47d2-b50d-d7b95a1cf7a1-kube-api-access-nvgsn\") on node \"crc\" DevicePath \"\"" Mar 16 00:38:04 crc kubenswrapper[4816]: I0316 00:38:04.667803 4816 scope.go:117] "RemoveContainer" containerID="ad2d141490ffe2b1f91c3a2296efeb65e936382d1300bb9398efe83e779ccd8a" Mar 16 00:38:04 crc kubenswrapper[4816]: E0316 00:38:04.668189 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrdcz_openshift-machine-config-operator(dd08ece2-7636-4966-973a-e96a34b70b53)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" podUID="dd08ece2-7636-4966-973a-e96a34b70b53" Mar 16 00:38:04 crc kubenswrapper[4816]: I0316 00:38:04.932582 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29560358-dc9b5" event={"ID":"f515fa39-31b2-47d2-b50d-d7b95a1cf7a1","Type":"ContainerDied","Data":"cff1a249957940ab78d95b02deb4fde3f8fbd654950d4bd438fb1337cb77b306"} Mar 16 00:38:04 crc kubenswrapper[4816]: I0316 00:38:04.932618 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cff1a249957940ab78d95b02deb4fde3f8fbd654950d4bd438fb1337cb77b306" Mar 16 00:38:04 crc kubenswrapper[4816]: I0316 00:38:04.932619 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560358-dc9b5" Mar 16 00:38:05 crc kubenswrapper[4816]: I0316 00:38:05.231296 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29560352-4c2cj"] Mar 16 00:38:05 crc kubenswrapper[4816]: I0316 00:38:05.237095 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29560352-4c2cj"] Mar 16 00:38:05 crc kubenswrapper[4816]: I0316 00:38:05.677238 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cee5a2cc-4256-43fb-9517-83533a5acf29" path="/var/lib/kubelet/pods/cee5a2cc-4256-43fb-9517-83533a5acf29/volumes" Mar 16 00:38:09 crc kubenswrapper[4816]: I0316 00:38:09.687756 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-7dbcddcc6f-lqrgz_0faefde0-6740-414f-bf47-0d763a35b22f/operator/0.log" Mar 16 00:38:12 crc kubenswrapper[4816]: I0316 00:38:12.998664 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-55c8479bdf-6m44w_e96079bc-73ba-420e-9568-cea10077c4ae/operator/0.log" Mar 16 00:38:13 crc kubenswrapper[4816]: I0316 00:38:13.320307 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_qdr-test_e014dd01-f826-4642-bb43-dbdab4a1e503/qdr/0.log" Mar 16 00:38:15 crc kubenswrapper[4816]: I0316 00:38:15.668335 4816 scope.go:117] 
"RemoveContainer" containerID="ad2d141490ffe2b1f91c3a2296efeb65e936382d1300bb9398efe83e779ccd8a" Mar 16 00:38:15 crc kubenswrapper[4816]: E0316 00:38:15.669138 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrdcz_openshift-machine-config-operator(dd08ece2-7636-4966-973a-e96a34b70b53)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" podUID="dd08ece2-7636-4966-973a-e96a34b70b53" Mar 16 00:38:29 crc kubenswrapper[4816]: I0316 00:38:29.668206 4816 scope.go:117] "RemoveContainer" containerID="ad2d141490ffe2b1f91c3a2296efeb65e936382d1300bb9398efe83e779ccd8a" Mar 16 00:38:29 crc kubenswrapper[4816]: E0316 00:38:29.669441 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrdcz_openshift-machine-config-operator(dd08ece2-7636-4966-973a-e96a34b70b53)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" podUID="dd08ece2-7636-4966-973a-e96a34b70b53" Mar 16 00:38:39 crc kubenswrapper[4816]: I0316 00:38:39.287845 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-75vc8/must-gather-jdjt6"] Mar 16 00:38:39 crc kubenswrapper[4816]: E0316 00:38:39.288771 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f515fa39-31b2-47d2-b50d-d7b95a1cf7a1" containerName="oc" Mar 16 00:38:39 crc kubenswrapper[4816]: I0316 00:38:39.288791 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="f515fa39-31b2-47d2-b50d-d7b95a1cf7a1" containerName="oc" Mar 16 00:38:39 crc kubenswrapper[4816]: I0316 00:38:39.288981 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="f515fa39-31b2-47d2-b50d-d7b95a1cf7a1" containerName="oc" Mar 16 
00:38:39 crc kubenswrapper[4816]: I0316 00:38:39.290065 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-75vc8/must-gather-jdjt6" Mar 16 00:38:39 crc kubenswrapper[4816]: I0316 00:38:39.292456 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-75vc8"/"default-dockercfg-8s7pv" Mar 16 00:38:39 crc kubenswrapper[4816]: I0316 00:38:39.292654 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-75vc8"/"kube-root-ca.crt" Mar 16 00:38:39 crc kubenswrapper[4816]: I0316 00:38:39.292757 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-75vc8"/"openshift-service-ca.crt" Mar 16 00:38:39 crc kubenswrapper[4816]: I0316 00:38:39.313865 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4x46w\" (UniqueName: \"kubernetes.io/projected/09265502-9e41-4783-ad8d-206a0b0372c8-kube-api-access-4x46w\") pod \"must-gather-jdjt6\" (UID: \"09265502-9e41-4783-ad8d-206a0b0372c8\") " pod="openshift-must-gather-75vc8/must-gather-jdjt6" Mar 16 00:38:39 crc kubenswrapper[4816]: I0316 00:38:39.314210 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/09265502-9e41-4783-ad8d-206a0b0372c8-must-gather-output\") pod \"must-gather-jdjt6\" (UID: \"09265502-9e41-4783-ad8d-206a0b0372c8\") " pod="openshift-must-gather-75vc8/must-gather-jdjt6" Mar 16 00:38:39 crc kubenswrapper[4816]: I0316 00:38:39.353089 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-75vc8/must-gather-jdjt6"] Mar 16 00:38:39 crc kubenswrapper[4816]: I0316 00:38:39.415683 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: 
\"kubernetes.io/empty-dir/09265502-9e41-4783-ad8d-206a0b0372c8-must-gather-output\") pod \"must-gather-jdjt6\" (UID: \"09265502-9e41-4783-ad8d-206a0b0372c8\") " pod="openshift-must-gather-75vc8/must-gather-jdjt6" Mar 16 00:38:39 crc kubenswrapper[4816]: I0316 00:38:39.415772 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4x46w\" (UniqueName: \"kubernetes.io/projected/09265502-9e41-4783-ad8d-206a0b0372c8-kube-api-access-4x46w\") pod \"must-gather-jdjt6\" (UID: \"09265502-9e41-4783-ad8d-206a0b0372c8\") " pod="openshift-must-gather-75vc8/must-gather-jdjt6" Mar 16 00:38:39 crc kubenswrapper[4816]: I0316 00:38:39.416246 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/09265502-9e41-4783-ad8d-206a0b0372c8-must-gather-output\") pod \"must-gather-jdjt6\" (UID: \"09265502-9e41-4783-ad8d-206a0b0372c8\") " pod="openshift-must-gather-75vc8/must-gather-jdjt6" Mar 16 00:38:39 crc kubenswrapper[4816]: I0316 00:38:39.435660 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4x46w\" (UniqueName: \"kubernetes.io/projected/09265502-9e41-4783-ad8d-206a0b0372c8-kube-api-access-4x46w\") pod \"must-gather-jdjt6\" (UID: \"09265502-9e41-4783-ad8d-206a0b0372c8\") " pod="openshift-must-gather-75vc8/must-gather-jdjt6" Mar 16 00:38:39 crc kubenswrapper[4816]: I0316 00:38:39.606543 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-75vc8/must-gather-jdjt6" Mar 16 00:38:39 crc kubenswrapper[4816]: I0316 00:38:39.860206 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-75vc8/must-gather-jdjt6"] Mar 16 00:38:40 crc kubenswrapper[4816]: I0316 00:38:40.251676 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-75vc8/must-gather-jdjt6" event={"ID":"09265502-9e41-4783-ad8d-206a0b0372c8","Type":"ContainerStarted","Data":"dea01f9b295d86a7b4b3bc5abc7b94cb38d7e5aae51dde99d686b6fd876042de"} Mar 16 00:38:43 crc kubenswrapper[4816]: I0316 00:38:43.667763 4816 scope.go:117] "RemoveContainer" containerID="ad2d141490ffe2b1f91c3a2296efeb65e936382d1300bb9398efe83e779ccd8a" Mar 16 00:38:43 crc kubenswrapper[4816]: E0316 00:38:43.668209 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrdcz_openshift-machine-config-operator(dd08ece2-7636-4966-973a-e96a34b70b53)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" podUID="dd08ece2-7636-4966-973a-e96a34b70b53" Mar 16 00:38:46 crc kubenswrapper[4816]: I0316 00:38:46.303254 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-75vc8/must-gather-jdjt6" event={"ID":"09265502-9e41-4783-ad8d-206a0b0372c8","Type":"ContainerStarted","Data":"494b4782083857fe979ca56945f86f2f0ad0c22c9919d181363fe40238962eaa"} Mar 16 00:38:47 crc kubenswrapper[4816]: I0316 00:38:47.314447 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-75vc8/must-gather-jdjt6" event={"ID":"09265502-9e41-4783-ad8d-206a0b0372c8","Type":"ContainerStarted","Data":"8d1fe728d192a689b113e6ab30597764dabd8f0121d2a1200332da846701716a"} Mar 16 00:38:47 crc kubenswrapper[4816]: I0316 00:38:47.340071 4816 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openshift-must-gather-75vc8/must-gather-jdjt6" podStartSLOduration=2.183096339 podStartE2EDuration="8.340051869s" podCreationTimestamp="2026-03-16 00:38:39 +0000 UTC" firstStartedPulling="2026-03-16 00:38:39.869782277 +0000 UTC m=+1912.966082230" lastFinishedPulling="2026-03-16 00:38:46.026737807 +0000 UTC m=+1919.123037760" observedRunningTime="2026-03-16 00:38:47.33409119 +0000 UTC m=+1920.430391183" watchObservedRunningTime="2026-03-16 00:38:47.340051869 +0000 UTC m=+1920.436351822" Mar 16 00:38:56 crc kubenswrapper[4816]: I0316 00:38:56.667845 4816 scope.go:117] "RemoveContainer" containerID="ad2d141490ffe2b1f91c3a2296efeb65e936382d1300bb9398efe83e779ccd8a" Mar 16 00:38:56 crc kubenswrapper[4816]: E0316 00:38:56.668695 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrdcz_openshift-machine-config-operator(dd08ece2-7636-4966-973a-e96a34b70b53)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" podUID="dd08ece2-7636-4966-973a-e96a34b70b53" Mar 16 00:39:05 crc kubenswrapper[4816]: I0316 00:39:05.498047 4816 scope.go:117] "RemoveContainer" containerID="e30d5325e2ed5b1448d06f7c2b9b149b118aecd27db13855743e289187e36f13" Mar 16 00:39:05 crc kubenswrapper[4816]: I0316 00:39:05.542131 4816 scope.go:117] "RemoveContainer" containerID="d67550a0f7d36330df42139700602ca21750bc1c1f58bc1cc3a210d0f86409d3" Mar 16 00:39:05 crc kubenswrapper[4816]: I0316 00:39:05.582594 4816 scope.go:117] "RemoveContainer" containerID="e4439cd35f13a68a04a6b45eaa00f3aeec10afe4bbea233d056d60648e32f1e4" Mar 16 00:39:05 crc kubenswrapper[4816]: I0316 00:39:05.602856 4816 scope.go:117] "RemoveContainer" containerID="520f1908678615d0e5b73ebdbbe6a48ebe9c84b2afda6fc6f03c6d31b9a2fb39" Mar 16 00:39:05 crc kubenswrapper[4816]: I0316 00:39:05.630215 4816 scope.go:117] 
"RemoveContainer" containerID="2169e8fca36c31b741a4793cc4a50c325f1ec3d6141a69fbe357f1c522080d5b" Mar 16 00:39:08 crc kubenswrapper[4816]: I0316 00:39:08.668524 4816 scope.go:117] "RemoveContainer" containerID="ad2d141490ffe2b1f91c3a2296efeb65e936382d1300bb9398efe83e779ccd8a" Mar 16 00:39:08 crc kubenswrapper[4816]: E0316 00:39:08.669127 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrdcz_openshift-machine-config-operator(dd08ece2-7636-4966-973a-e96a34b70b53)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" podUID="dd08ece2-7636-4966-973a-e96a34b70b53" Mar 16 00:39:21 crc kubenswrapper[4816]: I0316 00:39:21.668319 4816 scope.go:117] "RemoveContainer" containerID="ad2d141490ffe2b1f91c3a2296efeb65e936382d1300bb9398efe83e779ccd8a" Mar 16 00:39:21 crc kubenswrapper[4816]: E0316 00:39:21.669105 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrdcz_openshift-machine-config-operator(dd08ece2-7636-4966-973a-e96a34b70b53)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" podUID="dd08ece2-7636-4966-973a-e96a34b70b53" Mar 16 00:39:24 crc kubenswrapper[4816]: I0316 00:39:24.475451 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/infrawatch-operators-jg5tv"] Mar 16 00:39:24 crc kubenswrapper[4816]: I0316 00:39:24.476436 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/infrawatch-operators-jg5tv" Mar 16 00:39:24 crc kubenswrapper[4816]: I0316 00:39:24.491242 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/infrawatch-operators-jg5tv"] Mar 16 00:39:24 crc kubenswrapper[4816]: I0316 00:39:24.517541 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x2j9c\" (UniqueName: \"kubernetes.io/projected/35c26e16-0aa9-4252-995c-d34bd14ed1a9-kube-api-access-x2j9c\") pod \"infrawatch-operators-jg5tv\" (UID: \"35c26e16-0aa9-4252-995c-d34bd14ed1a9\") " pod="service-telemetry/infrawatch-operators-jg5tv" Mar 16 00:39:24 crc kubenswrapper[4816]: I0316 00:39:24.618701 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x2j9c\" (UniqueName: \"kubernetes.io/projected/35c26e16-0aa9-4252-995c-d34bd14ed1a9-kube-api-access-x2j9c\") pod \"infrawatch-operators-jg5tv\" (UID: \"35c26e16-0aa9-4252-995c-d34bd14ed1a9\") " pod="service-telemetry/infrawatch-operators-jg5tv" Mar 16 00:39:24 crc kubenswrapper[4816]: I0316 00:39:24.656780 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x2j9c\" (UniqueName: \"kubernetes.io/projected/35c26e16-0aa9-4252-995c-d34bd14ed1a9-kube-api-access-x2j9c\") pod \"infrawatch-operators-jg5tv\" (UID: \"35c26e16-0aa9-4252-995c-d34bd14ed1a9\") " pod="service-telemetry/infrawatch-operators-jg5tv" Mar 16 00:39:24 crc kubenswrapper[4816]: I0316 00:39:24.793062 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/infrawatch-operators-jg5tv" Mar 16 00:39:25 crc kubenswrapper[4816]: I0316 00:39:25.037982 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/infrawatch-operators-jg5tv"] Mar 16 00:39:25 crc kubenswrapper[4816]: I0316 00:39:25.044241 4816 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 16 00:39:25 crc kubenswrapper[4816]: I0316 00:39:25.621301 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/infrawatch-operators-jg5tv" event={"ID":"35c26e16-0aa9-4252-995c-d34bd14ed1a9","Type":"ContainerStarted","Data":"7063f0384c727fd195a24f4d4b1521ae98f4a83606caa980a4cdb71dd42e36bd"} Mar 16 00:39:25 crc kubenswrapper[4816]: I0316 00:39:25.621620 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/infrawatch-operators-jg5tv" event={"ID":"35c26e16-0aa9-4252-995c-d34bd14ed1a9","Type":"ContainerStarted","Data":"5cfc0be3c97d0c2aeedde44df26a4971ceee6aafb9c2f7efbcbc89b8db40d266"} Mar 16 00:39:25 crc kubenswrapper[4816]: I0316 00:39:25.640327 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/infrawatch-operators-jg5tv" podStartSLOduration=1.528942855 podStartE2EDuration="1.640310614s" podCreationTimestamp="2026-03-16 00:39:24 +0000 UTC" firstStartedPulling="2026-03-16 00:39:25.043943713 +0000 UTC m=+1958.140243676" lastFinishedPulling="2026-03-16 00:39:25.155311482 +0000 UTC m=+1958.251611435" observedRunningTime="2026-03-16 00:39:25.635418986 +0000 UTC m=+1958.731718949" watchObservedRunningTime="2026-03-16 00:39:25.640310614 +0000 UTC m=+1958.736610567" Mar 16 00:39:30 crc kubenswrapper[4816]: I0316 00:39:30.506914 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-mwhpz_dc6dfded-ec9e-4a6f-a97b-3bb4cf8a149a/control-plane-machine-set-operator/0.log" Mar 16 00:39:30 crc 
kubenswrapper[4816]: I0316 00:39:30.644764 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-9xv4p_1306b657-0022-435d-bb72-793f1c1a106b/kube-rbac-proxy/0.log" Mar 16 00:39:30 crc kubenswrapper[4816]: I0316 00:39:30.680501 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-9xv4p_1306b657-0022-435d-bb72-793f1c1a106b/machine-api-operator/0.log" Mar 16 00:39:33 crc kubenswrapper[4816]: I0316 00:39:33.668256 4816 scope.go:117] "RemoveContainer" containerID="ad2d141490ffe2b1f91c3a2296efeb65e936382d1300bb9398efe83e779ccd8a" Mar 16 00:39:33 crc kubenswrapper[4816]: E0316 00:39:33.668975 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrdcz_openshift-machine-config-operator(dd08ece2-7636-4966-973a-e96a34b70b53)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" podUID="dd08ece2-7636-4966-973a-e96a34b70b53" Mar 16 00:39:34 crc kubenswrapper[4816]: I0316 00:39:34.795229 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="service-telemetry/infrawatch-operators-jg5tv" Mar 16 00:39:34 crc kubenswrapper[4816]: I0316 00:39:34.795276 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="service-telemetry/infrawatch-operators-jg5tv" Mar 16 00:39:34 crc kubenswrapper[4816]: I0316 00:39:34.838639 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="service-telemetry/infrawatch-operators-jg5tv" Mar 16 00:39:35 crc kubenswrapper[4816]: I0316 00:39:35.730413 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="service-telemetry/infrawatch-operators-jg5tv" Mar 16 00:39:35 crc kubenswrapper[4816]: I0316 00:39:35.785687 4816 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["service-telemetry/infrawatch-operators-jg5tv"] Mar 16 00:39:37 crc kubenswrapper[4816]: I0316 00:39:37.713959 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/infrawatch-operators-jg5tv" podUID="35c26e16-0aa9-4252-995c-d34bd14ed1a9" containerName="registry-server" containerID="cri-o://7063f0384c727fd195a24f4d4b1521ae98f4a83606caa980a4cdb71dd42e36bd" gracePeriod=2 Mar 16 00:39:38 crc kubenswrapper[4816]: I0316 00:39:38.164502 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/infrawatch-operators-jg5tv" Mar 16 00:39:38 crc kubenswrapper[4816]: I0316 00:39:38.225508 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2j9c\" (UniqueName: \"kubernetes.io/projected/35c26e16-0aa9-4252-995c-d34bd14ed1a9-kube-api-access-x2j9c\") pod \"35c26e16-0aa9-4252-995c-d34bd14ed1a9\" (UID: \"35c26e16-0aa9-4252-995c-d34bd14ed1a9\") " Mar 16 00:39:38 crc kubenswrapper[4816]: I0316 00:39:38.234495 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35c26e16-0aa9-4252-995c-d34bd14ed1a9-kube-api-access-x2j9c" (OuterVolumeSpecName: "kube-api-access-x2j9c") pod "35c26e16-0aa9-4252-995c-d34bd14ed1a9" (UID: "35c26e16-0aa9-4252-995c-d34bd14ed1a9"). InnerVolumeSpecName "kube-api-access-x2j9c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:39:38 crc kubenswrapper[4816]: I0316 00:39:38.327281 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2j9c\" (UniqueName: \"kubernetes.io/projected/35c26e16-0aa9-4252-995c-d34bd14ed1a9-kube-api-access-x2j9c\") on node \"crc\" DevicePath \"\"" Mar 16 00:39:38 crc kubenswrapper[4816]: I0316 00:39:38.722927 4816 generic.go:334] "Generic (PLEG): container finished" podID="35c26e16-0aa9-4252-995c-d34bd14ed1a9" containerID="7063f0384c727fd195a24f4d4b1521ae98f4a83606caa980a4cdb71dd42e36bd" exitCode=0 Mar 16 00:39:38 crc kubenswrapper[4816]: I0316 00:39:38.722967 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/infrawatch-operators-jg5tv" event={"ID":"35c26e16-0aa9-4252-995c-d34bd14ed1a9","Type":"ContainerDied","Data":"7063f0384c727fd195a24f4d4b1521ae98f4a83606caa980a4cdb71dd42e36bd"} Mar 16 00:39:38 crc kubenswrapper[4816]: I0316 00:39:38.722996 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/infrawatch-operators-jg5tv" event={"ID":"35c26e16-0aa9-4252-995c-d34bd14ed1a9","Type":"ContainerDied","Data":"5cfc0be3c97d0c2aeedde44df26a4971ceee6aafb9c2f7efbcbc89b8db40d266"} Mar 16 00:39:38 crc kubenswrapper[4816]: I0316 00:39:38.723016 4816 scope.go:117] "RemoveContainer" containerID="7063f0384c727fd195a24f4d4b1521ae98f4a83606caa980a4cdb71dd42e36bd" Mar 16 00:39:38 crc kubenswrapper[4816]: I0316 00:39:38.723028 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/infrawatch-operators-jg5tv" Mar 16 00:39:38 crc kubenswrapper[4816]: I0316 00:39:38.748201 4816 scope.go:117] "RemoveContainer" containerID="7063f0384c727fd195a24f4d4b1521ae98f4a83606caa980a4cdb71dd42e36bd" Mar 16 00:39:38 crc kubenswrapper[4816]: E0316 00:39:38.748748 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7063f0384c727fd195a24f4d4b1521ae98f4a83606caa980a4cdb71dd42e36bd\": container with ID starting with 7063f0384c727fd195a24f4d4b1521ae98f4a83606caa980a4cdb71dd42e36bd not found: ID does not exist" containerID="7063f0384c727fd195a24f4d4b1521ae98f4a83606caa980a4cdb71dd42e36bd" Mar 16 00:39:38 crc kubenswrapper[4816]: I0316 00:39:38.748784 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7063f0384c727fd195a24f4d4b1521ae98f4a83606caa980a4cdb71dd42e36bd"} err="failed to get container status \"7063f0384c727fd195a24f4d4b1521ae98f4a83606caa980a4cdb71dd42e36bd\": rpc error: code = NotFound desc = could not find container \"7063f0384c727fd195a24f4d4b1521ae98f4a83606caa980a4cdb71dd42e36bd\": container with ID starting with 7063f0384c727fd195a24f4d4b1521ae98f4a83606caa980a4cdb71dd42e36bd not found: ID does not exist" Mar 16 00:39:38 crc kubenswrapper[4816]: I0316 00:39:38.758718 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/infrawatch-operators-jg5tv"] Mar 16 00:39:38 crc kubenswrapper[4816]: I0316 00:39:38.767852 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/infrawatch-operators-jg5tv"] Mar 16 00:39:39 crc kubenswrapper[4816]: I0316 00:39:39.676869 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="35c26e16-0aa9-4252-995c-d34bd14ed1a9" path="/var/lib/kubelet/pods/35c26e16-0aa9-4252-995c-d34bd14ed1a9/volumes" Mar 16 00:39:44 crc kubenswrapper[4816]: I0316 00:39:44.145516 4816 log.go:25] "Finished parsing 
log file" path="/var/log/pods/cert-manager_cert-manager-545d4d4674-9q9nz_88d51e1b-a795-4157-82b4-8a74d228e698/cert-manager-controller/0.log" Mar 16 00:39:44 crc kubenswrapper[4816]: I0316 00:39:44.294993 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-5545bd876-25jvg_fe81d263-aafd-4bdb-a088-d4bc52592a2d/cert-manager-cainjector/0.log" Mar 16 00:39:44 crc kubenswrapper[4816]: I0316 00:39:44.379049 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-6888856db4-ssr4q_ca67da37-05ff-4b13-aeea-04ac7f17ffc0/cert-manager-webhook/0.log" Mar 16 00:39:44 crc kubenswrapper[4816]: I0316 00:39:44.667977 4816 scope.go:117] "RemoveContainer" containerID="ad2d141490ffe2b1f91c3a2296efeb65e936382d1300bb9398efe83e779ccd8a" Mar 16 00:39:44 crc kubenswrapper[4816]: E0316 00:39:44.668237 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrdcz_openshift-machine-config-operator(dd08ece2-7636-4966-973a-e96a34b70b53)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" podUID="dd08ece2-7636-4966-973a-e96a34b70b53" Mar 16 00:39:56 crc kubenswrapper[4816]: I0316 00:39:56.667627 4816 scope.go:117] "RemoveContainer" containerID="ad2d141490ffe2b1f91c3a2296efeb65e936382d1300bb9398efe83e779ccd8a" Mar 16 00:39:56 crc kubenswrapper[4816]: E0316 00:39:56.668583 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrdcz_openshift-machine-config-operator(dd08ece2-7636-4966-973a-e96a34b70b53)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" podUID="dd08ece2-7636-4966-973a-e96a34b70b53" Mar 16 00:39:59 crc 
kubenswrapper[4816]: I0316 00:39:59.697198 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-tfv44_562f24fe-5c4c-4540-96ae-6e01f539141b/prometheus-operator/0.log" Mar 16 00:39:59 crc kubenswrapper[4816]: I0316 00:39:59.866625 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-6d8d794bc-2s9sk_9a808114-3164-4abe-a481-1b5d3b9df2a0/prometheus-operator-admission-webhook/0.log" Mar 16 00:39:59 crc kubenswrapper[4816]: I0316 00:39:59.917104 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-6d8d794bc-6bdkn_36951342-3370-4291-baa3-2612f64036fd/prometheus-operator-admission-webhook/0.log" Mar 16 00:40:00 crc kubenswrapper[4816]: I0316 00:40:00.052990 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-w6wv7_8d0f60fa-8d26-43ea-a680-1d3a92dd270d/operator/0.log" Mar 16 00:40:00 crc kubenswrapper[4816]: I0316 00:40:00.086238 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-t7w7m_f24959c1-f57f-4bf6-8a55-c8a35173ff8b/perses-operator/0.log" Mar 16 00:40:00 crc kubenswrapper[4816]: I0316 00:40:00.133656 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29560360-m8l8g"] Mar 16 00:40:00 crc kubenswrapper[4816]: E0316 00:40:00.133892 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35c26e16-0aa9-4252-995c-d34bd14ed1a9" containerName="registry-server" Mar 16 00:40:00 crc kubenswrapper[4816]: I0316 00:40:00.133904 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="35c26e16-0aa9-4252-995c-d34bd14ed1a9" containerName="registry-server" Mar 16 00:40:00 crc kubenswrapper[4816]: I0316 00:40:00.134022 4816 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="35c26e16-0aa9-4252-995c-d34bd14ed1a9" containerName="registry-server" Mar 16 00:40:00 crc kubenswrapper[4816]: I0316 00:40:00.134423 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560360-m8l8g" Mar 16 00:40:00 crc kubenswrapper[4816]: I0316 00:40:00.139890 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 16 00:40:00 crc kubenswrapper[4816]: I0316 00:40:00.140022 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 16 00:40:00 crc kubenswrapper[4816]: I0316 00:40:00.140087 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-8hc2r" Mar 16 00:40:00 crc kubenswrapper[4816]: I0316 00:40:00.147147 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29560360-m8l8g"] Mar 16 00:40:00 crc kubenswrapper[4816]: I0316 00:40:00.281750 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-phg76\" (UniqueName: \"kubernetes.io/projected/e737de26-7c67-4557-9e7b-67fd7e9d835b-kube-api-access-phg76\") pod \"auto-csr-approver-29560360-m8l8g\" (UID: \"e737de26-7c67-4557-9e7b-67fd7e9d835b\") " pod="openshift-infra/auto-csr-approver-29560360-m8l8g" Mar 16 00:40:00 crc kubenswrapper[4816]: I0316 00:40:00.383677 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-phg76\" (UniqueName: \"kubernetes.io/projected/e737de26-7c67-4557-9e7b-67fd7e9d835b-kube-api-access-phg76\") pod \"auto-csr-approver-29560360-m8l8g\" (UID: \"e737de26-7c67-4557-9e7b-67fd7e9d835b\") " pod="openshift-infra/auto-csr-approver-29560360-m8l8g" Mar 16 00:40:00 crc kubenswrapper[4816]: I0316 00:40:00.405533 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-phg76\" 
(UniqueName: \"kubernetes.io/projected/e737de26-7c67-4557-9e7b-67fd7e9d835b-kube-api-access-phg76\") pod \"auto-csr-approver-29560360-m8l8g\" (UID: \"e737de26-7c67-4557-9e7b-67fd7e9d835b\") " pod="openshift-infra/auto-csr-approver-29560360-m8l8g" Mar 16 00:40:00 crc kubenswrapper[4816]: I0316 00:40:00.481703 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560360-m8l8g" Mar 16 00:40:00 crc kubenswrapper[4816]: I0316 00:40:00.896516 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29560360-m8l8g"] Mar 16 00:40:00 crc kubenswrapper[4816]: I0316 00:40:00.907998 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560360-m8l8g" event={"ID":"e737de26-7c67-4557-9e7b-67fd7e9d835b","Type":"ContainerStarted","Data":"b81fe24230d2398c34926e1bf3e9257add7500c9b552f10b835db83e285f4bdc"} Mar 16 00:40:02 crc kubenswrapper[4816]: I0316 00:40:02.930165 4816 generic.go:334] "Generic (PLEG): container finished" podID="e737de26-7c67-4557-9e7b-67fd7e9d835b" containerID="965ec4e3b522a6a1ef0e1631c35ba6e14d08dbcd684be23e71cc56af8a75f39e" exitCode=0 Mar 16 00:40:02 crc kubenswrapper[4816]: I0316 00:40:02.930451 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560360-m8l8g" event={"ID":"e737de26-7c67-4557-9e7b-67fd7e9d835b","Type":"ContainerDied","Data":"965ec4e3b522a6a1ef0e1631c35ba6e14d08dbcd684be23e71cc56af8a75f39e"} Mar 16 00:40:04 crc kubenswrapper[4816]: I0316 00:40:04.188034 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29560360-m8l8g" Mar 16 00:40:04 crc kubenswrapper[4816]: I0316 00:40:04.236837 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-phg76\" (UniqueName: \"kubernetes.io/projected/e737de26-7c67-4557-9e7b-67fd7e9d835b-kube-api-access-phg76\") pod \"e737de26-7c67-4557-9e7b-67fd7e9d835b\" (UID: \"e737de26-7c67-4557-9e7b-67fd7e9d835b\") " Mar 16 00:40:04 crc kubenswrapper[4816]: I0316 00:40:04.242320 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e737de26-7c67-4557-9e7b-67fd7e9d835b-kube-api-access-phg76" (OuterVolumeSpecName: "kube-api-access-phg76") pod "e737de26-7c67-4557-9e7b-67fd7e9d835b" (UID: "e737de26-7c67-4557-9e7b-67fd7e9d835b"). InnerVolumeSpecName "kube-api-access-phg76". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:40:04 crc kubenswrapper[4816]: I0316 00:40:04.338921 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-phg76\" (UniqueName: \"kubernetes.io/projected/e737de26-7c67-4557-9e7b-67fd7e9d835b-kube-api-access-phg76\") on node \"crc\" DevicePath \"\"" Mar 16 00:40:04 crc kubenswrapper[4816]: I0316 00:40:04.943917 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560360-m8l8g" event={"ID":"e737de26-7c67-4557-9e7b-67fd7e9d835b","Type":"ContainerDied","Data":"b81fe24230d2398c34926e1bf3e9257add7500c9b552f10b835db83e285f4bdc"} Mar 16 00:40:04 crc kubenswrapper[4816]: I0316 00:40:04.943962 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b81fe24230d2398c34926e1bf3e9257add7500c9b552f10b835db83e285f4bdc" Mar 16 00:40:04 crc kubenswrapper[4816]: I0316 00:40:04.944025 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29560360-m8l8g" Mar 16 00:40:05 crc kubenswrapper[4816]: I0316 00:40:05.250245 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29560354-nmflm"] Mar 16 00:40:05 crc kubenswrapper[4816]: I0316 00:40:05.256660 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29560354-nmflm"] Mar 16 00:40:05 crc kubenswrapper[4816]: I0316 00:40:05.684340 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4786aa78-4870-43d7-a324-e3e3dd2c7943" path="/var/lib/kubelet/pods/4786aa78-4870-43d7-a324-e3e3dd2c7943/volumes" Mar 16 00:40:07 crc kubenswrapper[4816]: I0316 00:40:07.671947 4816 scope.go:117] "RemoveContainer" containerID="ad2d141490ffe2b1f91c3a2296efeb65e936382d1300bb9398efe83e779ccd8a" Mar 16 00:40:07 crc kubenswrapper[4816]: E0316 00:40:07.672196 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrdcz_openshift-machine-config-operator(dd08ece2-7636-4966-973a-e96a34b70b53)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" podUID="dd08ece2-7636-4966-973a-e96a34b70b53" Mar 16 00:40:12 crc kubenswrapper[4816]: I0316 00:40:12.879479 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-vn8mx"] Mar 16 00:40:12 crc kubenswrapper[4816]: E0316 00:40:12.879962 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e737de26-7c67-4557-9e7b-67fd7e9d835b" containerName="oc" Mar 16 00:40:12 crc kubenswrapper[4816]: I0316 00:40:12.879973 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="e737de26-7c67-4557-9e7b-67fd7e9d835b" containerName="oc" Mar 16 00:40:12 crc kubenswrapper[4816]: I0316 00:40:12.880098 4816 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="e737de26-7c67-4557-9e7b-67fd7e9d835b" containerName="oc" Mar 16 00:40:12 crc kubenswrapper[4816]: I0316 00:40:12.880908 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vn8mx" Mar 16 00:40:12 crc kubenswrapper[4816]: I0316 00:40:12.901908 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vn8mx"] Mar 16 00:40:12 crc kubenswrapper[4816]: I0316 00:40:12.964178 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f54343a-e5a5-4a0f-9940-5dabdbea5927-utilities\") pod \"community-operators-vn8mx\" (UID: \"4f54343a-e5a5-4a0f-9940-5dabdbea5927\") " pod="openshift-marketplace/community-operators-vn8mx" Mar 16 00:40:12 crc kubenswrapper[4816]: I0316 00:40:12.964250 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sndqw\" (UniqueName: \"kubernetes.io/projected/4f54343a-e5a5-4a0f-9940-5dabdbea5927-kube-api-access-sndqw\") pod \"community-operators-vn8mx\" (UID: \"4f54343a-e5a5-4a0f-9940-5dabdbea5927\") " pod="openshift-marketplace/community-operators-vn8mx" Mar 16 00:40:12 crc kubenswrapper[4816]: I0316 00:40:12.964300 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f54343a-e5a5-4a0f-9940-5dabdbea5927-catalog-content\") pod \"community-operators-vn8mx\" (UID: \"4f54343a-e5a5-4a0f-9940-5dabdbea5927\") " pod="openshift-marketplace/community-operators-vn8mx" Mar 16 00:40:13 crc kubenswrapper[4816]: I0316 00:40:13.065267 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f54343a-e5a5-4a0f-9940-5dabdbea5927-utilities\") pod \"community-operators-vn8mx\" (UID: 
\"4f54343a-e5a5-4a0f-9940-5dabdbea5927\") " pod="openshift-marketplace/community-operators-vn8mx" Mar 16 00:40:13 crc kubenswrapper[4816]: I0316 00:40:13.065519 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sndqw\" (UniqueName: \"kubernetes.io/projected/4f54343a-e5a5-4a0f-9940-5dabdbea5927-kube-api-access-sndqw\") pod \"community-operators-vn8mx\" (UID: \"4f54343a-e5a5-4a0f-9940-5dabdbea5927\") " pod="openshift-marketplace/community-operators-vn8mx" Mar 16 00:40:13 crc kubenswrapper[4816]: I0316 00:40:13.065647 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f54343a-e5a5-4a0f-9940-5dabdbea5927-catalog-content\") pod \"community-operators-vn8mx\" (UID: \"4f54343a-e5a5-4a0f-9940-5dabdbea5927\") " pod="openshift-marketplace/community-operators-vn8mx" Mar 16 00:40:13 crc kubenswrapper[4816]: I0316 00:40:13.066156 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f54343a-e5a5-4a0f-9940-5dabdbea5927-catalog-content\") pod \"community-operators-vn8mx\" (UID: \"4f54343a-e5a5-4a0f-9940-5dabdbea5927\") " pod="openshift-marketplace/community-operators-vn8mx" Mar 16 00:40:13 crc kubenswrapper[4816]: I0316 00:40:13.066426 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f54343a-e5a5-4a0f-9940-5dabdbea5927-utilities\") pod \"community-operators-vn8mx\" (UID: \"4f54343a-e5a5-4a0f-9940-5dabdbea5927\") " pod="openshift-marketplace/community-operators-vn8mx" Mar 16 00:40:13 crc kubenswrapper[4816]: I0316 00:40:13.085236 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sndqw\" (UniqueName: \"kubernetes.io/projected/4f54343a-e5a5-4a0f-9940-5dabdbea5927-kube-api-access-sndqw\") pod \"community-operators-vn8mx\" (UID: 
\"4f54343a-e5a5-4a0f-9940-5dabdbea5927\") " pod="openshift-marketplace/community-operators-vn8mx" Mar 16 00:40:13 crc kubenswrapper[4816]: I0316 00:40:13.255366 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vn8mx" Mar 16 00:40:14 crc kubenswrapper[4816]: I0316 00:40:14.585607 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vn8mx"] Mar 16 00:40:14 crc kubenswrapper[4816]: I0316 00:40:14.880726 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-rr89x"] Mar 16 00:40:14 crc kubenswrapper[4816]: I0316 00:40:14.882598 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rr89x" Mar 16 00:40:14 crc kubenswrapper[4816]: I0316 00:40:14.897367 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rr89x"] Mar 16 00:40:14 crc kubenswrapper[4816]: I0316 00:40:14.993406 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c859eca9-ca62-4c8a-a11d-dac23fafcec5-catalog-content\") pod \"certified-operators-rr89x\" (UID: \"c859eca9-ca62-4c8a-a11d-dac23fafcec5\") " pod="openshift-marketplace/certified-operators-rr89x" Mar 16 00:40:14 crc kubenswrapper[4816]: I0316 00:40:14.993453 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c859eca9-ca62-4c8a-a11d-dac23fafcec5-utilities\") pod \"certified-operators-rr89x\" (UID: \"c859eca9-ca62-4c8a-a11d-dac23fafcec5\") " pod="openshift-marketplace/certified-operators-rr89x" Mar 16 00:40:14 crc kubenswrapper[4816]: I0316 00:40:14.993474 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-9d67l\" (UniqueName: \"kubernetes.io/projected/c859eca9-ca62-4c8a-a11d-dac23fafcec5-kube-api-access-9d67l\") pod \"certified-operators-rr89x\" (UID: \"c859eca9-ca62-4c8a-a11d-dac23fafcec5\") " pod="openshift-marketplace/certified-operators-rr89x" Mar 16 00:40:15 crc kubenswrapper[4816]: I0316 00:40:15.059385 4816 generic.go:334] "Generic (PLEG): container finished" podID="4f54343a-e5a5-4a0f-9940-5dabdbea5927" containerID="0234f62b2ab36f0fcff6495931cadcf31214e09994e6a394309fae2df91b1f67" exitCode=0 Mar 16 00:40:15 crc kubenswrapper[4816]: I0316 00:40:15.059484 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vn8mx" event={"ID":"4f54343a-e5a5-4a0f-9940-5dabdbea5927","Type":"ContainerDied","Data":"0234f62b2ab36f0fcff6495931cadcf31214e09994e6a394309fae2df91b1f67"} Mar 16 00:40:15 crc kubenswrapper[4816]: I0316 00:40:15.059750 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vn8mx" event={"ID":"4f54343a-e5a5-4a0f-9940-5dabdbea5927","Type":"ContainerStarted","Data":"7ee26dd0ee79cd708284dffb1050308eefe558df23e3cb8fc337e2dbccfbdb55"} Mar 16 00:40:15 crc kubenswrapper[4816]: I0316 00:40:15.095098 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c859eca9-ca62-4c8a-a11d-dac23fafcec5-catalog-content\") pod \"certified-operators-rr89x\" (UID: \"c859eca9-ca62-4c8a-a11d-dac23fafcec5\") " pod="openshift-marketplace/certified-operators-rr89x" Mar 16 00:40:15 crc kubenswrapper[4816]: I0316 00:40:15.095150 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c859eca9-ca62-4c8a-a11d-dac23fafcec5-utilities\") pod \"certified-operators-rr89x\" (UID: \"c859eca9-ca62-4c8a-a11d-dac23fafcec5\") " pod="openshift-marketplace/certified-operators-rr89x" Mar 16 00:40:15 crc kubenswrapper[4816]: I0316 
00:40:15.095187 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9d67l\" (UniqueName: \"kubernetes.io/projected/c859eca9-ca62-4c8a-a11d-dac23fafcec5-kube-api-access-9d67l\") pod \"certified-operators-rr89x\" (UID: \"c859eca9-ca62-4c8a-a11d-dac23fafcec5\") " pod="openshift-marketplace/certified-operators-rr89x" Mar 16 00:40:15 crc kubenswrapper[4816]: I0316 00:40:15.095904 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c859eca9-ca62-4c8a-a11d-dac23fafcec5-utilities\") pod \"certified-operators-rr89x\" (UID: \"c859eca9-ca62-4c8a-a11d-dac23fafcec5\") " pod="openshift-marketplace/certified-operators-rr89x" Mar 16 00:40:15 crc kubenswrapper[4816]: I0316 00:40:15.096241 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c859eca9-ca62-4c8a-a11d-dac23fafcec5-catalog-content\") pod \"certified-operators-rr89x\" (UID: \"c859eca9-ca62-4c8a-a11d-dac23fafcec5\") " pod="openshift-marketplace/certified-operators-rr89x" Mar 16 00:40:15 crc kubenswrapper[4816]: I0316 00:40:15.118865 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9d67l\" (UniqueName: \"kubernetes.io/projected/c859eca9-ca62-4c8a-a11d-dac23fafcec5-kube-api-access-9d67l\") pod \"certified-operators-rr89x\" (UID: \"c859eca9-ca62-4c8a-a11d-dac23fafcec5\") " pod="openshift-marketplace/certified-operators-rr89x" Mar 16 00:40:15 crc kubenswrapper[4816]: I0316 00:40:15.205488 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rr89x" Mar 16 00:40:15 crc kubenswrapper[4816]: I0316 00:40:15.762550 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rr89x"] Mar 16 00:40:16 crc kubenswrapper[4816]: I0316 00:40:16.036817 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fhdqd9_43895212-4bba-4d69-b3eb-10f49e771de3/util/0.log" Mar 16 00:40:16 crc kubenswrapper[4816]: I0316 00:40:16.067430 4816 generic.go:334] "Generic (PLEG): container finished" podID="c859eca9-ca62-4c8a-a11d-dac23fafcec5" containerID="8f93df4c6d7e443e6258a1faadd00d501b3f70b30052f1330cfb00dfaabbe7d6" exitCode=0 Mar 16 00:40:16 crc kubenswrapper[4816]: I0316 00:40:16.067490 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rr89x" event={"ID":"c859eca9-ca62-4c8a-a11d-dac23fafcec5","Type":"ContainerDied","Data":"8f93df4c6d7e443e6258a1faadd00d501b3f70b30052f1330cfb00dfaabbe7d6"} Mar 16 00:40:16 crc kubenswrapper[4816]: I0316 00:40:16.067517 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rr89x" event={"ID":"c859eca9-ca62-4c8a-a11d-dac23fafcec5","Type":"ContainerStarted","Data":"725b71ee7bad24f50af2dd8bfdd56c09f41a8531b097fc153f7ad1c3c22a44fd"} Mar 16 00:40:16 crc kubenswrapper[4816]: I0316 00:40:16.069542 4816 generic.go:334] "Generic (PLEG): container finished" podID="4f54343a-e5a5-4a0f-9940-5dabdbea5927" containerID="574e1468950e2da4a0e7dd1e555435fcf041931b0ea91ff0fc260bfb5bf229eb" exitCode=0 Mar 16 00:40:16 crc kubenswrapper[4816]: I0316 00:40:16.069593 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vn8mx" event={"ID":"4f54343a-e5a5-4a0f-9940-5dabdbea5927","Type":"ContainerDied","Data":"574e1468950e2da4a0e7dd1e555435fcf041931b0ea91ff0fc260bfb5bf229eb"} Mar 16 
00:40:16 crc kubenswrapper[4816]: I0316 00:40:16.197621 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fhdqd9_43895212-4bba-4d69-b3eb-10f49e771de3/pull/0.log" Mar 16 00:40:16 crc kubenswrapper[4816]: I0316 00:40:16.227696 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fhdqd9_43895212-4bba-4d69-b3eb-10f49e771de3/util/0.log" Mar 16 00:40:16 crc kubenswrapper[4816]: I0316 00:40:16.239459 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fhdqd9_43895212-4bba-4d69-b3eb-10f49e771de3/pull/0.log" Mar 16 00:40:16 crc kubenswrapper[4816]: I0316 00:40:16.383094 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fhdqd9_43895212-4bba-4d69-b3eb-10f49e771de3/pull/0.log" Mar 16 00:40:16 crc kubenswrapper[4816]: I0316 00:40:16.386610 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fhdqd9_43895212-4bba-4d69-b3eb-10f49e771de3/util/0.log" Mar 16 00:40:16 crc kubenswrapper[4816]: I0316 00:40:16.401380 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fhdqd9_43895212-4bba-4d69-b3eb-10f49e771de3/extract/0.log" Mar 16 00:40:16 crc kubenswrapper[4816]: I0316 00:40:16.595095 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39exnc6z_3e8bb6e1-f8fd-4484-ba21-a2d5f80f0d1c/util/0.log" Mar 16 00:40:16 crc kubenswrapper[4816]: I0316 00:40:16.751233 4816 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39exnc6z_3e8bb6e1-f8fd-4484-ba21-a2d5f80f0d1c/pull/0.log" Mar 16 00:40:16 crc kubenswrapper[4816]: I0316 00:40:16.764618 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39exnc6z_3e8bb6e1-f8fd-4484-ba21-a2d5f80f0d1c/util/0.log" Mar 16 00:40:16 crc kubenswrapper[4816]: I0316 00:40:16.783101 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39exnc6z_3e8bb6e1-f8fd-4484-ba21-a2d5f80f0d1c/pull/0.log" Mar 16 00:40:17 crc kubenswrapper[4816]: I0316 00:40:17.012109 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39exnc6z_3e8bb6e1-f8fd-4484-ba21-a2d5f80f0d1c/pull/0.log" Mar 16 00:40:17 crc kubenswrapper[4816]: I0316 00:40:17.028440 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39exnc6z_3e8bb6e1-f8fd-4484-ba21-a2d5f80f0d1c/extract/0.log" Mar 16 00:40:17 crc kubenswrapper[4816]: I0316 00:40:17.048118 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39exnc6z_3e8bb6e1-f8fd-4484-ba21-a2d5f80f0d1c/util/0.log" Mar 16 00:40:17 crc kubenswrapper[4816]: I0316 00:40:17.079059 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vn8mx" event={"ID":"4f54343a-e5a5-4a0f-9940-5dabdbea5927","Type":"ContainerStarted","Data":"5a05b74fb378721c8ab3e182d2dc627e6e98513d531aa144da324b46cbf22795"} Mar 16 00:40:17 crc kubenswrapper[4816]: I0316 00:40:17.097401 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-vn8mx" 
podStartSLOduration=3.623918883 podStartE2EDuration="5.097386362s" podCreationTimestamp="2026-03-16 00:40:12 +0000 UTC" firstStartedPulling="2026-03-16 00:40:15.061109 +0000 UTC m=+2008.157408953" lastFinishedPulling="2026-03-16 00:40:16.534576469 +0000 UTC m=+2009.630876432" observedRunningTime="2026-03-16 00:40:17.095313573 +0000 UTC m=+2010.191613526" watchObservedRunningTime="2026-03-16 00:40:17.097386362 +0000 UTC m=+2010.193686315" Mar 16 00:40:17 crc kubenswrapper[4816]: I0316 00:40:17.238104 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59v87l_1da45fda-a8cc-46c1-8831-58418ecc9819/util/0.log" Mar 16 00:40:17 crc kubenswrapper[4816]: I0316 00:40:17.396691 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59v87l_1da45fda-a8cc-46c1-8831-58418ecc9819/util/0.log" Mar 16 00:40:17 crc kubenswrapper[4816]: I0316 00:40:17.423579 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59v87l_1da45fda-a8cc-46c1-8831-58418ecc9819/pull/0.log" Mar 16 00:40:17 crc kubenswrapper[4816]: I0316 00:40:17.431157 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59v87l_1da45fda-a8cc-46c1-8831-58418ecc9819/pull/0.log" Mar 16 00:40:17 crc kubenswrapper[4816]: I0316 00:40:17.586286 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59v87l_1da45fda-a8cc-46c1-8831-58418ecc9819/util/0.log" Mar 16 00:40:17 crc kubenswrapper[4816]: I0316 00:40:17.614707 4816 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59v87l_1da45fda-a8cc-46c1-8831-58418ecc9819/extract/0.log" Mar 16 00:40:17 crc kubenswrapper[4816]: I0316 00:40:17.653193 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59v87l_1da45fda-a8cc-46c1-8831-58418ecc9819/pull/0.log" Mar 16 00:40:17 crc kubenswrapper[4816]: I0316 00:40:17.795898 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085xqb4_35d36436-ca87-48ef-9a68-484c2335bb33/util/0.log" Mar 16 00:40:17 crc kubenswrapper[4816]: I0316 00:40:17.928178 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085xqb4_35d36436-ca87-48ef-9a68-484c2335bb33/util/0.log" Mar 16 00:40:17 crc kubenswrapper[4816]: I0316 00:40:17.990652 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085xqb4_35d36436-ca87-48ef-9a68-484c2335bb33/pull/0.log" Mar 16 00:40:18 crc kubenswrapper[4816]: I0316 00:40:18.054577 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085xqb4_35d36436-ca87-48ef-9a68-484c2335bb33/pull/0.log" Mar 16 00:40:18 crc kubenswrapper[4816]: I0316 00:40:18.196499 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085xqb4_35d36436-ca87-48ef-9a68-484c2335bb33/extract/0.log" Mar 16 00:40:18 crc kubenswrapper[4816]: I0316 00:40:18.248737 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085xqb4_35d36436-ca87-48ef-9a68-484c2335bb33/util/0.log" Mar 
16 00:40:18 crc kubenswrapper[4816]: I0316 00:40:18.250611 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085xqb4_35d36436-ca87-48ef-9a68-484c2335bb33/pull/0.log" Mar 16 00:40:18 crc kubenswrapper[4816]: I0316 00:40:18.387914 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-6z2gx_9d1b1f79-de52-4ade-9a72-69b86c55e8ff/extract-utilities/0.log" Mar 16 00:40:18 crc kubenswrapper[4816]: I0316 00:40:18.624982 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-6z2gx_9d1b1f79-de52-4ade-9a72-69b86c55e8ff/extract-content/0.log" Mar 16 00:40:18 crc kubenswrapper[4816]: I0316 00:40:18.632108 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-6z2gx_9d1b1f79-de52-4ade-9a72-69b86c55e8ff/extract-content/0.log" Mar 16 00:40:18 crc kubenswrapper[4816]: I0316 00:40:18.649317 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-6z2gx_9d1b1f79-de52-4ade-9a72-69b86c55e8ff/extract-utilities/0.log" Mar 16 00:40:18 crc kubenswrapper[4816]: I0316 00:40:18.829662 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-6z2gx_9d1b1f79-de52-4ade-9a72-69b86c55e8ff/extract-utilities/0.log" Mar 16 00:40:18 crc kubenswrapper[4816]: I0316 00:40:18.866064 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-6z2gx_9d1b1f79-de52-4ade-9a72-69b86c55e8ff/extract-content/0.log" Mar 16 00:40:19 crc kubenswrapper[4816]: I0316 00:40:19.017958 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-rr89x_c859eca9-ca62-4c8a-a11d-dac23fafcec5/extract-utilities/0.log" Mar 16 00:40:19 crc kubenswrapper[4816]: I0316 00:40:19.149080 4816 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-6z2gx_9d1b1f79-de52-4ade-9a72-69b86c55e8ff/registry-server/0.log" Mar 16 00:40:19 crc kubenswrapper[4816]: I0316 00:40:19.199974 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-rr89x_c859eca9-ca62-4c8a-a11d-dac23fafcec5/extract-utilities/0.log" Mar 16 00:40:19 crc kubenswrapper[4816]: I0316 00:40:19.399543 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-rr89x_c859eca9-ca62-4c8a-a11d-dac23fafcec5/extract-utilities/0.log" Mar 16 00:40:19 crc kubenswrapper[4816]: I0316 00:40:19.505173 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-8jcgw_249ae30f-a698-43f3-9464-24868dff2ad6/extract-utilities/0.log" Mar 16 00:40:19 crc kubenswrapper[4816]: I0316 00:40:19.642049 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-8jcgw_249ae30f-a698-43f3-9464-24868dff2ad6/extract-content/0.log" Mar 16 00:40:19 crc kubenswrapper[4816]: I0316 00:40:19.646487 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-8jcgw_249ae30f-a698-43f3-9464-24868dff2ad6/extract-utilities/0.log" Mar 16 00:40:19 crc kubenswrapper[4816]: I0316 00:40:19.671158 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-8jcgw_249ae30f-a698-43f3-9464-24868dff2ad6/extract-content/0.log" Mar 16 00:40:19 crc kubenswrapper[4816]: I0316 00:40:19.822609 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-8jcgw_249ae30f-a698-43f3-9464-24868dff2ad6/extract-content/0.log" Mar 16 00:40:19 crc kubenswrapper[4816]: I0316 00:40:19.900085 4816 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-vn8mx_4f54343a-e5a5-4a0f-9940-5dabdbea5927/extract-utilities/0.log" Mar 16 00:40:19 crc kubenswrapper[4816]: I0316 00:40:19.920917 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-8jcgw_249ae30f-a698-43f3-9464-24868dff2ad6/extract-utilities/0.log" Mar 16 00:40:20 crc kubenswrapper[4816]: I0316 00:40:20.123038 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-vn8mx_4f54343a-e5a5-4a0f-9940-5dabdbea5927/extract-utilities/0.log" Mar 16 00:40:20 crc kubenswrapper[4816]: I0316 00:40:20.149898 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-vn8mx_4f54343a-e5a5-4a0f-9940-5dabdbea5927/extract-content/0.log" Mar 16 00:40:20 crc kubenswrapper[4816]: I0316 00:40:20.162072 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-vn8mx_4f54343a-e5a5-4a0f-9940-5dabdbea5927/extract-content/0.log" Mar 16 00:40:20 crc kubenswrapper[4816]: I0316 00:40:20.338412 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-vn8mx_4f54343a-e5a5-4a0f-9940-5dabdbea5927/registry-server/0.log" Mar 16 00:40:20 crc kubenswrapper[4816]: I0316 00:40:20.357848 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-vn8mx_4f54343a-e5a5-4a0f-9940-5dabdbea5927/extract-content/0.log" Mar 16 00:40:20 crc kubenswrapper[4816]: I0316 00:40:20.394238 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-8jcgw_249ae30f-a698-43f3-9464-24868dff2ad6/registry-server/0.log" Mar 16 00:40:20 crc kubenswrapper[4816]: I0316 00:40:20.407181 4816 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-vn8mx_4f54343a-e5a5-4a0f-9940-5dabdbea5927/extract-utilities/0.log" Mar 16 00:40:20 crc kubenswrapper[4816]: I0316 00:40:20.552725 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-hmzx7_6df1dc3a-6abd-4ffc-b27b-e66f281ed273/extract-utilities/0.log" Mar 16 00:40:20 crc kubenswrapper[4816]: I0316 00:40:20.570191 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-8ln7g_6d197f63-0b7c-496d-89bb-9cd70933969a/marketplace-operator/0.log" Mar 16 00:40:20 crc kubenswrapper[4816]: I0316 00:40:20.736597 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-hmzx7_6df1dc3a-6abd-4ffc-b27b-e66f281ed273/extract-content/0.log" Mar 16 00:40:20 crc kubenswrapper[4816]: I0316 00:40:20.736850 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-hmzx7_6df1dc3a-6abd-4ffc-b27b-e66f281ed273/extract-utilities/0.log" Mar 16 00:40:20 crc kubenswrapper[4816]: I0316 00:40:20.757202 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-hmzx7_6df1dc3a-6abd-4ffc-b27b-e66f281ed273/extract-content/0.log" Mar 16 00:40:20 crc kubenswrapper[4816]: I0316 00:40:20.917981 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-hmzx7_6df1dc3a-6abd-4ffc-b27b-e66f281ed273/extract-utilities/0.log" Mar 16 00:40:20 crc kubenswrapper[4816]: I0316 00:40:20.939068 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-hmzx7_6df1dc3a-6abd-4ffc-b27b-e66f281ed273/extract-content/0.log" Mar 16 00:40:21 crc kubenswrapper[4816]: I0316 00:40:21.213110 4816 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-hmzx7_6df1dc3a-6abd-4ffc-b27b-e66f281ed273/registry-server/0.log" Mar 16 00:40:21 crc kubenswrapper[4816]: I0316 00:40:21.667620 4816 scope.go:117] "RemoveContainer" containerID="ad2d141490ffe2b1f91c3a2296efeb65e936382d1300bb9398efe83e779ccd8a" Mar 16 00:40:21 crc kubenswrapper[4816]: E0316 00:40:21.667854 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrdcz_openshift-machine-config-operator(dd08ece2-7636-4966-973a-e96a34b70b53)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" podUID="dd08ece2-7636-4966-973a-e96a34b70b53" Mar 16 00:40:23 crc kubenswrapper[4816]: I0316 00:40:23.127696 4816 generic.go:334] "Generic (PLEG): container finished" podID="c859eca9-ca62-4c8a-a11d-dac23fafcec5" containerID="5f2c0db44c076ad7e9a7e236b9040381aff8d488da201ce9ed1de55f218946d9" exitCode=0 Mar 16 00:40:23 crc kubenswrapper[4816]: I0316 00:40:23.127835 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rr89x" event={"ID":"c859eca9-ca62-4c8a-a11d-dac23fafcec5","Type":"ContainerDied","Data":"5f2c0db44c076ad7e9a7e236b9040381aff8d488da201ce9ed1de55f218946d9"} Mar 16 00:40:23 crc kubenswrapper[4816]: I0316 00:40:23.255776 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-vn8mx" Mar 16 00:40:23 crc kubenswrapper[4816]: I0316 00:40:23.255865 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-vn8mx" Mar 16 00:40:23 crc kubenswrapper[4816]: I0316 00:40:23.322700 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-vn8mx" Mar 16 00:40:24 crc kubenswrapper[4816]: I0316 
00:40:24.138152 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rr89x" event={"ID":"c859eca9-ca62-4c8a-a11d-dac23fafcec5","Type":"ContainerStarted","Data":"6d6395fd6fd2c13b0749094b0edc67b382632e3df50267fec6f9cc1310773f3c"} Mar 16 00:40:24 crc kubenswrapper[4816]: I0316 00:40:24.167145 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-rr89x" podStartSLOduration=2.720901708 podStartE2EDuration="10.167120569s" podCreationTimestamp="2026-03-16 00:40:14 +0000 UTC" firstStartedPulling="2026-03-16 00:40:16.06950614 +0000 UTC m=+2009.165806093" lastFinishedPulling="2026-03-16 00:40:23.515724961 +0000 UTC m=+2016.612024954" observedRunningTime="2026-03-16 00:40:24.159098262 +0000 UTC m=+2017.255398225" watchObservedRunningTime="2026-03-16 00:40:24.167120569 +0000 UTC m=+2017.263420522" Mar 16 00:40:24 crc kubenswrapper[4816]: I0316 00:40:24.202454 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-vn8mx" Mar 16 00:40:25 crc kubenswrapper[4816]: I0316 00:40:25.205937 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-rr89x" Mar 16 00:40:25 crc kubenswrapper[4816]: I0316 00:40:25.206180 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-rr89x" Mar 16 00:40:25 crc kubenswrapper[4816]: I0316 00:40:25.373464 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vn8mx"] Mar 16 00:40:26 crc kubenswrapper[4816]: I0316 00:40:26.155797 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-vn8mx" podUID="4f54343a-e5a5-4a0f-9940-5dabdbea5927" containerName="registry-server" 
containerID="cri-o://5a05b74fb378721c8ab3e182d2dc627e6e98513d531aa144da324b46cbf22795" gracePeriod=2 Mar 16 00:40:26 crc kubenswrapper[4816]: I0316 00:40:26.260029 4816 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-rr89x" podUID="c859eca9-ca62-4c8a-a11d-dac23fafcec5" containerName="registry-server" probeResult="failure" output=< Mar 16 00:40:26 crc kubenswrapper[4816]: timeout: failed to connect service ":50051" within 1s Mar 16 00:40:26 crc kubenswrapper[4816]: > Mar 16 00:40:26 crc kubenswrapper[4816]: I0316 00:40:26.526327 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vn8mx" Mar 16 00:40:26 crc kubenswrapper[4816]: I0316 00:40:26.671804 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sndqw\" (UniqueName: \"kubernetes.io/projected/4f54343a-e5a5-4a0f-9940-5dabdbea5927-kube-api-access-sndqw\") pod \"4f54343a-e5a5-4a0f-9940-5dabdbea5927\" (UID: \"4f54343a-e5a5-4a0f-9940-5dabdbea5927\") " Mar 16 00:40:26 crc kubenswrapper[4816]: I0316 00:40:26.671943 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f54343a-e5a5-4a0f-9940-5dabdbea5927-catalog-content\") pod \"4f54343a-e5a5-4a0f-9940-5dabdbea5927\" (UID: \"4f54343a-e5a5-4a0f-9940-5dabdbea5927\") " Mar 16 00:40:26 crc kubenswrapper[4816]: I0316 00:40:26.672024 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f54343a-e5a5-4a0f-9940-5dabdbea5927-utilities\") pod \"4f54343a-e5a5-4a0f-9940-5dabdbea5927\" (UID: \"4f54343a-e5a5-4a0f-9940-5dabdbea5927\") " Mar 16 00:40:26 crc kubenswrapper[4816]: I0316 00:40:26.672742 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f54343a-e5a5-4a0f-9940-5dabdbea5927-utilities" 
(OuterVolumeSpecName: "utilities") pod "4f54343a-e5a5-4a0f-9940-5dabdbea5927" (UID: "4f54343a-e5a5-4a0f-9940-5dabdbea5927"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:40:26 crc kubenswrapper[4816]: I0316 00:40:26.677685 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f54343a-e5a5-4a0f-9940-5dabdbea5927-kube-api-access-sndqw" (OuterVolumeSpecName: "kube-api-access-sndqw") pod "4f54343a-e5a5-4a0f-9940-5dabdbea5927" (UID: "4f54343a-e5a5-4a0f-9940-5dabdbea5927"). InnerVolumeSpecName "kube-api-access-sndqw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:40:26 crc kubenswrapper[4816]: I0316 00:40:26.720992 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f54343a-e5a5-4a0f-9940-5dabdbea5927-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4f54343a-e5a5-4a0f-9940-5dabdbea5927" (UID: "4f54343a-e5a5-4a0f-9940-5dabdbea5927"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:40:26 crc kubenswrapper[4816]: I0316 00:40:26.773405 4816 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f54343a-e5a5-4a0f-9940-5dabdbea5927-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 16 00:40:26 crc kubenswrapper[4816]: I0316 00:40:26.773447 4816 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f54343a-e5a5-4a0f-9940-5dabdbea5927-utilities\") on node \"crc\" DevicePath \"\"" Mar 16 00:40:26 crc kubenswrapper[4816]: I0316 00:40:26.773460 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sndqw\" (UniqueName: \"kubernetes.io/projected/4f54343a-e5a5-4a0f-9940-5dabdbea5927-kube-api-access-sndqw\") on node \"crc\" DevicePath \"\"" Mar 16 00:40:27 crc kubenswrapper[4816]: I0316 00:40:27.172547 4816 generic.go:334] "Generic (PLEG): container finished" podID="4f54343a-e5a5-4a0f-9940-5dabdbea5927" containerID="5a05b74fb378721c8ab3e182d2dc627e6e98513d531aa144da324b46cbf22795" exitCode=0 Mar 16 00:40:27 crc kubenswrapper[4816]: I0316 00:40:27.172747 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vn8mx" event={"ID":"4f54343a-e5a5-4a0f-9940-5dabdbea5927","Type":"ContainerDied","Data":"5a05b74fb378721c8ab3e182d2dc627e6e98513d531aa144da324b46cbf22795"} Mar 16 00:40:27 crc kubenswrapper[4816]: I0316 00:40:27.172993 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vn8mx" event={"ID":"4f54343a-e5a5-4a0f-9940-5dabdbea5927","Type":"ContainerDied","Data":"7ee26dd0ee79cd708284dffb1050308eefe558df23e3cb8fc337e2dbccfbdb55"} Mar 16 00:40:27 crc kubenswrapper[4816]: I0316 00:40:27.172864 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vn8mx" Mar 16 00:40:27 crc kubenswrapper[4816]: I0316 00:40:27.173092 4816 scope.go:117] "RemoveContainer" containerID="5a05b74fb378721c8ab3e182d2dc627e6e98513d531aa144da324b46cbf22795" Mar 16 00:40:27 crc kubenswrapper[4816]: I0316 00:40:27.195768 4816 scope.go:117] "RemoveContainer" containerID="574e1468950e2da4a0e7dd1e555435fcf041931b0ea91ff0fc260bfb5bf229eb" Mar 16 00:40:27 crc kubenswrapper[4816]: I0316 00:40:27.205306 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vn8mx"] Mar 16 00:40:27 crc kubenswrapper[4816]: I0316 00:40:27.212544 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-vn8mx"] Mar 16 00:40:27 crc kubenswrapper[4816]: I0316 00:40:27.214669 4816 scope.go:117] "RemoveContainer" containerID="0234f62b2ab36f0fcff6495931cadcf31214e09994e6a394309fae2df91b1f67" Mar 16 00:40:27 crc kubenswrapper[4816]: I0316 00:40:27.243309 4816 scope.go:117] "RemoveContainer" containerID="5a05b74fb378721c8ab3e182d2dc627e6e98513d531aa144da324b46cbf22795" Mar 16 00:40:27 crc kubenswrapper[4816]: E0316 00:40:27.244016 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a05b74fb378721c8ab3e182d2dc627e6e98513d531aa144da324b46cbf22795\": container with ID starting with 5a05b74fb378721c8ab3e182d2dc627e6e98513d531aa144da324b46cbf22795 not found: ID does not exist" containerID="5a05b74fb378721c8ab3e182d2dc627e6e98513d531aa144da324b46cbf22795" Mar 16 00:40:27 crc kubenswrapper[4816]: I0316 00:40:27.244059 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a05b74fb378721c8ab3e182d2dc627e6e98513d531aa144da324b46cbf22795"} err="failed to get container status \"5a05b74fb378721c8ab3e182d2dc627e6e98513d531aa144da324b46cbf22795\": rpc error: code = NotFound desc = could not find 
container \"5a05b74fb378721c8ab3e182d2dc627e6e98513d531aa144da324b46cbf22795\": container with ID starting with 5a05b74fb378721c8ab3e182d2dc627e6e98513d531aa144da324b46cbf22795 not found: ID does not exist" Mar 16 00:40:27 crc kubenswrapper[4816]: I0316 00:40:27.244088 4816 scope.go:117] "RemoveContainer" containerID="574e1468950e2da4a0e7dd1e555435fcf041931b0ea91ff0fc260bfb5bf229eb" Mar 16 00:40:27 crc kubenswrapper[4816]: E0316 00:40:27.244494 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"574e1468950e2da4a0e7dd1e555435fcf041931b0ea91ff0fc260bfb5bf229eb\": container with ID starting with 574e1468950e2da4a0e7dd1e555435fcf041931b0ea91ff0fc260bfb5bf229eb not found: ID does not exist" containerID="574e1468950e2da4a0e7dd1e555435fcf041931b0ea91ff0fc260bfb5bf229eb" Mar 16 00:40:27 crc kubenswrapper[4816]: I0316 00:40:27.244534 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"574e1468950e2da4a0e7dd1e555435fcf041931b0ea91ff0fc260bfb5bf229eb"} err="failed to get container status \"574e1468950e2da4a0e7dd1e555435fcf041931b0ea91ff0fc260bfb5bf229eb\": rpc error: code = NotFound desc = could not find container \"574e1468950e2da4a0e7dd1e555435fcf041931b0ea91ff0fc260bfb5bf229eb\": container with ID starting with 574e1468950e2da4a0e7dd1e555435fcf041931b0ea91ff0fc260bfb5bf229eb not found: ID does not exist" Mar 16 00:40:27 crc kubenswrapper[4816]: I0316 00:40:27.244578 4816 scope.go:117] "RemoveContainer" containerID="0234f62b2ab36f0fcff6495931cadcf31214e09994e6a394309fae2df91b1f67" Mar 16 00:40:27 crc kubenswrapper[4816]: E0316 00:40:27.244856 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0234f62b2ab36f0fcff6495931cadcf31214e09994e6a394309fae2df91b1f67\": container with ID starting with 0234f62b2ab36f0fcff6495931cadcf31214e09994e6a394309fae2df91b1f67 not found: ID does 
not exist" containerID="0234f62b2ab36f0fcff6495931cadcf31214e09994e6a394309fae2df91b1f67" Mar 16 00:40:27 crc kubenswrapper[4816]: I0316 00:40:27.244897 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0234f62b2ab36f0fcff6495931cadcf31214e09994e6a394309fae2df91b1f67"} err="failed to get container status \"0234f62b2ab36f0fcff6495931cadcf31214e09994e6a394309fae2df91b1f67\": rpc error: code = NotFound desc = could not find container \"0234f62b2ab36f0fcff6495931cadcf31214e09994e6a394309fae2df91b1f67\": container with ID starting with 0234f62b2ab36f0fcff6495931cadcf31214e09994e6a394309fae2df91b1f67 not found: ID does not exist" Mar 16 00:40:27 crc kubenswrapper[4816]: I0316 00:40:27.686515 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f54343a-e5a5-4a0f-9940-5dabdbea5927" path="/var/lib/kubelet/pods/4f54343a-e5a5-4a0f-9940-5dabdbea5927/volumes" Mar 16 00:40:32 crc kubenswrapper[4816]: I0316 00:40:32.668044 4816 scope.go:117] "RemoveContainer" containerID="ad2d141490ffe2b1f91c3a2296efeb65e936382d1300bb9398efe83e779ccd8a" Mar 16 00:40:32 crc kubenswrapper[4816]: E0316 00:40:32.668495 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrdcz_openshift-machine-config-operator(dd08ece2-7636-4966-973a-e96a34b70b53)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" podUID="dd08ece2-7636-4966-973a-e96a34b70b53" Mar 16 00:40:34 crc kubenswrapper[4816]: I0316 00:40:34.190124 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-6d8d794bc-2s9sk_9a808114-3164-4abe-a481-1b5d3b9df2a0/prometheus-operator-admission-webhook/0.log" Mar 16 00:40:34 crc kubenswrapper[4816]: I0316 00:40:34.216327 4816 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-6d8d794bc-6bdkn_36951342-3370-4291-baa3-2612f64036fd/prometheus-operator-admission-webhook/0.log" Mar 16 00:40:34 crc kubenswrapper[4816]: I0316 00:40:34.220477 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-tfv44_562f24fe-5c4c-4540-96ae-6e01f539141b/prometheus-operator/0.log" Mar 16 00:40:34 crc kubenswrapper[4816]: I0316 00:40:34.321188 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-w6wv7_8d0f60fa-8d26-43ea-a680-1d3a92dd270d/operator/0.log" Mar 16 00:40:34 crc kubenswrapper[4816]: I0316 00:40:34.378610 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-t7w7m_f24959c1-f57f-4bf6-8a55-c8a35173ff8b/perses-operator/0.log" Mar 16 00:40:35 crc kubenswrapper[4816]: I0316 00:40:35.261672 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-rr89x" Mar 16 00:40:35 crc kubenswrapper[4816]: I0316 00:40:35.317203 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-rr89x" Mar 16 00:40:35 crc kubenswrapper[4816]: I0316 00:40:35.392283 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rr89x"] Mar 16 00:40:35 crc kubenswrapper[4816]: I0316 00:40:35.492697 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6z2gx"] Mar 16 00:40:35 crc kubenswrapper[4816]: I0316 00:40:35.492976 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-6z2gx" podUID="9d1b1f79-de52-4ade-9a72-69b86c55e8ff" containerName="registry-server" containerID="cri-o://700492b2ebbd124fe994672d95f3a4f2989857c4185a74f7530dda83e9156f15" 
gracePeriod=2 Mar 16 00:40:36 crc kubenswrapper[4816]: I0316 00:40:36.233227 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6z2gx" Mar 16 00:40:36 crc kubenswrapper[4816]: I0316 00:40:36.245229 4816 generic.go:334] "Generic (PLEG): container finished" podID="9d1b1f79-de52-4ade-9a72-69b86c55e8ff" containerID="700492b2ebbd124fe994672d95f3a4f2989857c4185a74f7530dda83e9156f15" exitCode=0 Mar 16 00:40:36 crc kubenswrapper[4816]: I0316 00:40:36.245278 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6z2gx" event={"ID":"9d1b1f79-de52-4ade-9a72-69b86c55e8ff","Type":"ContainerDied","Data":"700492b2ebbd124fe994672d95f3a4f2989857c4185a74f7530dda83e9156f15"} Mar 16 00:40:36 crc kubenswrapper[4816]: I0316 00:40:36.245320 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6z2gx" Mar 16 00:40:36 crc kubenswrapper[4816]: I0316 00:40:36.245347 4816 scope.go:117] "RemoveContainer" containerID="700492b2ebbd124fe994672d95f3a4f2989857c4185a74f7530dda83e9156f15" Mar 16 00:40:36 crc kubenswrapper[4816]: I0316 00:40:36.245333 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6z2gx" event={"ID":"9d1b1f79-de52-4ade-9a72-69b86c55e8ff","Type":"ContainerDied","Data":"d6634923f047727775df02d4d821820bfe16c08bbd5f740d3677d67d9b993223"} Mar 16 00:40:36 crc kubenswrapper[4816]: I0316 00:40:36.268580 4816 scope.go:117] "RemoveContainer" containerID="083bb242e6bde224d00d304d1f2d497cdabe3bfd93e6548b23a50066c9a1dbce" Mar 16 00:40:36 crc kubenswrapper[4816]: I0316 00:40:36.286833 4816 scope.go:117] "RemoveContainer" containerID="5404e0e23149a051c63ece5c9f7e35a40c4b4a2c3bd255f946675c4984f1d005" Mar 16 00:40:36 crc kubenswrapper[4816]: I0316 00:40:36.312015 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d1b1f79-de52-4ade-9a72-69b86c55e8ff-catalog-content\") pod \"9d1b1f79-de52-4ade-9a72-69b86c55e8ff\" (UID: \"9d1b1f79-de52-4ade-9a72-69b86c55e8ff\") " Mar 16 00:40:36 crc kubenswrapper[4816]: I0316 00:40:36.312085 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d1b1f79-de52-4ade-9a72-69b86c55e8ff-utilities\") pod \"9d1b1f79-de52-4ade-9a72-69b86c55e8ff\" (UID: \"9d1b1f79-de52-4ade-9a72-69b86c55e8ff\") " Mar 16 00:40:36 crc kubenswrapper[4816]: I0316 00:40:36.312163 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8pfdf\" (UniqueName: \"kubernetes.io/projected/9d1b1f79-de52-4ade-9a72-69b86c55e8ff-kube-api-access-8pfdf\") pod \"9d1b1f79-de52-4ade-9a72-69b86c55e8ff\" (UID: \"9d1b1f79-de52-4ade-9a72-69b86c55e8ff\") " Mar 16 00:40:36 crc kubenswrapper[4816]: I0316 00:40:36.313058 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9d1b1f79-de52-4ade-9a72-69b86c55e8ff-utilities" (OuterVolumeSpecName: "utilities") pod "9d1b1f79-de52-4ade-9a72-69b86c55e8ff" (UID: "9d1b1f79-de52-4ade-9a72-69b86c55e8ff"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:40:36 crc kubenswrapper[4816]: I0316 00:40:36.318806 4816 scope.go:117] "RemoveContainer" containerID="700492b2ebbd124fe994672d95f3a4f2989857c4185a74f7530dda83e9156f15" Mar 16 00:40:36 crc kubenswrapper[4816]: I0316 00:40:36.323056 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d1b1f79-de52-4ade-9a72-69b86c55e8ff-kube-api-access-8pfdf" (OuterVolumeSpecName: "kube-api-access-8pfdf") pod "9d1b1f79-de52-4ade-9a72-69b86c55e8ff" (UID: "9d1b1f79-de52-4ade-9a72-69b86c55e8ff"). InnerVolumeSpecName "kube-api-access-8pfdf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:40:36 crc kubenswrapper[4816]: E0316 00:40:36.326731 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"700492b2ebbd124fe994672d95f3a4f2989857c4185a74f7530dda83e9156f15\": container with ID starting with 700492b2ebbd124fe994672d95f3a4f2989857c4185a74f7530dda83e9156f15 not found: ID does not exist" containerID="700492b2ebbd124fe994672d95f3a4f2989857c4185a74f7530dda83e9156f15" Mar 16 00:40:36 crc kubenswrapper[4816]: I0316 00:40:36.326791 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"700492b2ebbd124fe994672d95f3a4f2989857c4185a74f7530dda83e9156f15"} err="failed to get container status \"700492b2ebbd124fe994672d95f3a4f2989857c4185a74f7530dda83e9156f15\": rpc error: code = NotFound desc = could not find container \"700492b2ebbd124fe994672d95f3a4f2989857c4185a74f7530dda83e9156f15\": container with ID starting with 700492b2ebbd124fe994672d95f3a4f2989857c4185a74f7530dda83e9156f15 not found: ID does not exist" Mar 16 00:40:36 crc kubenswrapper[4816]: I0316 00:40:36.326820 4816 scope.go:117] "RemoveContainer" containerID="083bb242e6bde224d00d304d1f2d497cdabe3bfd93e6548b23a50066c9a1dbce" Mar 16 00:40:36 crc kubenswrapper[4816]: E0316 00:40:36.327392 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"083bb242e6bde224d00d304d1f2d497cdabe3bfd93e6548b23a50066c9a1dbce\": container with ID starting with 083bb242e6bde224d00d304d1f2d497cdabe3bfd93e6548b23a50066c9a1dbce not found: ID does not exist" containerID="083bb242e6bde224d00d304d1f2d497cdabe3bfd93e6548b23a50066c9a1dbce" Mar 16 00:40:36 crc kubenswrapper[4816]: I0316 00:40:36.327434 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"083bb242e6bde224d00d304d1f2d497cdabe3bfd93e6548b23a50066c9a1dbce"} 
err="failed to get container status \"083bb242e6bde224d00d304d1f2d497cdabe3bfd93e6548b23a50066c9a1dbce\": rpc error: code = NotFound desc = could not find container \"083bb242e6bde224d00d304d1f2d497cdabe3bfd93e6548b23a50066c9a1dbce\": container with ID starting with 083bb242e6bde224d00d304d1f2d497cdabe3bfd93e6548b23a50066c9a1dbce not found: ID does not exist" Mar 16 00:40:36 crc kubenswrapper[4816]: I0316 00:40:36.327462 4816 scope.go:117] "RemoveContainer" containerID="5404e0e23149a051c63ece5c9f7e35a40c4b4a2c3bd255f946675c4984f1d005" Mar 16 00:40:36 crc kubenswrapper[4816]: E0316 00:40:36.327717 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5404e0e23149a051c63ece5c9f7e35a40c4b4a2c3bd255f946675c4984f1d005\": container with ID starting with 5404e0e23149a051c63ece5c9f7e35a40c4b4a2c3bd255f946675c4984f1d005 not found: ID does not exist" containerID="5404e0e23149a051c63ece5c9f7e35a40c4b4a2c3bd255f946675c4984f1d005" Mar 16 00:40:36 crc kubenswrapper[4816]: I0316 00:40:36.327748 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5404e0e23149a051c63ece5c9f7e35a40c4b4a2c3bd255f946675c4984f1d005"} err="failed to get container status \"5404e0e23149a051c63ece5c9f7e35a40c4b4a2c3bd255f946675c4984f1d005\": rpc error: code = NotFound desc = could not find container \"5404e0e23149a051c63ece5c9f7e35a40c4b4a2c3bd255f946675c4984f1d005\": container with ID starting with 5404e0e23149a051c63ece5c9f7e35a40c4b4a2c3bd255f946675c4984f1d005 not found: ID does not exist" Mar 16 00:40:36 crc kubenswrapper[4816]: I0316 00:40:36.361118 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9d1b1f79-de52-4ade-9a72-69b86c55e8ff-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9d1b1f79-de52-4ade-9a72-69b86c55e8ff" (UID: "9d1b1f79-de52-4ade-9a72-69b86c55e8ff"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:40:36 crc kubenswrapper[4816]: I0316 00:40:36.413284 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8pfdf\" (UniqueName: \"kubernetes.io/projected/9d1b1f79-de52-4ade-9a72-69b86c55e8ff-kube-api-access-8pfdf\") on node \"crc\" DevicePath \"\"" Mar 16 00:40:36 crc kubenswrapper[4816]: I0316 00:40:36.413324 4816 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d1b1f79-de52-4ade-9a72-69b86c55e8ff-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 16 00:40:36 crc kubenswrapper[4816]: I0316 00:40:36.413333 4816 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d1b1f79-de52-4ade-9a72-69b86c55e8ff-utilities\") on node \"crc\" DevicePath \"\"" Mar 16 00:40:36 crc kubenswrapper[4816]: I0316 00:40:36.577205 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6z2gx"] Mar 16 00:40:36 crc kubenswrapper[4816]: I0316 00:40:36.582534 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-6z2gx"] Mar 16 00:40:37 crc kubenswrapper[4816]: I0316 00:40:37.677096 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d1b1f79-de52-4ade-9a72-69b86c55e8ff" path="/var/lib/kubelet/pods/9d1b1f79-de52-4ade-9a72-69b86c55e8ff/volumes" Mar 16 00:40:45 crc kubenswrapper[4816]: I0316 00:40:45.668333 4816 scope.go:117] "RemoveContainer" containerID="ad2d141490ffe2b1f91c3a2296efeb65e936382d1300bb9398efe83e779ccd8a" Mar 16 00:40:45 crc kubenswrapper[4816]: E0316 00:40:45.669434 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-jrdcz_openshift-machine-config-operator(dd08ece2-7636-4966-973a-e96a34b70b53)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" podUID="dd08ece2-7636-4966-973a-e96a34b70b53" Mar 16 00:40:57 crc kubenswrapper[4816]: I0316 00:40:57.686959 4816 scope.go:117] "RemoveContainer" containerID="ad2d141490ffe2b1f91c3a2296efeb65e936382d1300bb9398efe83e779ccd8a" Mar 16 00:40:57 crc kubenswrapper[4816]: E0316 00:40:57.687991 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrdcz_openshift-machine-config-operator(dd08ece2-7636-4966-973a-e96a34b70b53)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" podUID="dd08ece2-7636-4966-973a-e96a34b70b53" Mar 16 00:41:05 crc kubenswrapper[4816]: I0316 00:41:05.735754 4816 scope.go:117] "RemoveContainer" containerID="0c26c66eab197680871c2539e7ed1477694cb8e32e0bc0cdad1221a9720899f7" Mar 16 00:41:09 crc kubenswrapper[4816]: I0316 00:41:09.667804 4816 scope.go:117] "RemoveContainer" containerID="ad2d141490ffe2b1f91c3a2296efeb65e936382d1300bb9398efe83e779ccd8a" Mar 16 00:41:09 crc kubenswrapper[4816]: E0316 00:41:09.668688 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrdcz_openshift-machine-config-operator(dd08ece2-7636-4966-973a-e96a34b70b53)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" podUID="dd08ece2-7636-4966-973a-e96a34b70b53" Mar 16 00:41:20 crc kubenswrapper[4816]: I0316 00:41:20.668200 4816 scope.go:117] "RemoveContainer" containerID="ad2d141490ffe2b1f91c3a2296efeb65e936382d1300bb9398efe83e779ccd8a" Mar 16 00:41:20 crc kubenswrapper[4816]: E0316 00:41:20.669659 4816 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrdcz_openshift-machine-config-operator(dd08ece2-7636-4966-973a-e96a34b70b53)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" podUID="dd08ece2-7636-4966-973a-e96a34b70b53" Mar 16 00:41:26 crc kubenswrapper[4816]: I0316 00:41:26.832546 4816 generic.go:334] "Generic (PLEG): container finished" podID="09265502-9e41-4783-ad8d-206a0b0372c8" containerID="494b4782083857fe979ca56945f86f2f0ad0c22c9919d181363fe40238962eaa" exitCode=0 Mar 16 00:41:26 crc kubenswrapper[4816]: I0316 00:41:26.832590 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-75vc8/must-gather-jdjt6" event={"ID":"09265502-9e41-4783-ad8d-206a0b0372c8","Type":"ContainerDied","Data":"494b4782083857fe979ca56945f86f2f0ad0c22c9919d181363fe40238962eaa"} Mar 16 00:41:26 crc kubenswrapper[4816]: I0316 00:41:26.833612 4816 scope.go:117] "RemoveContainer" containerID="494b4782083857fe979ca56945f86f2f0ad0c22c9919d181363fe40238962eaa" Mar 16 00:41:27 crc kubenswrapper[4816]: I0316 00:41:27.100828 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-75vc8_must-gather-jdjt6_09265502-9e41-4783-ad8d-206a0b0372c8/gather/0.log" Mar 16 00:41:33 crc kubenswrapper[4816]: I0316 00:41:33.668972 4816 scope.go:117] "RemoveContainer" containerID="ad2d141490ffe2b1f91c3a2296efeb65e936382d1300bb9398efe83e779ccd8a" Mar 16 00:41:33 crc kubenswrapper[4816]: I0316 00:41:33.902565 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrdcz" event={"ID":"dd08ece2-7636-4966-973a-e96a34b70b53","Type":"ContainerStarted","Data":"08e6183680abbd0a1de1120a16451b7c7144bdc2fd91a97f582aa7293556cc2e"} Mar 16 00:41:34 crc kubenswrapper[4816]: I0316 00:41:34.119202 4816 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-75vc8/must-gather-jdjt6"] Mar 16 00:41:34 crc kubenswrapper[4816]: I0316 00:41:34.119737 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-75vc8/must-gather-jdjt6" podUID="09265502-9e41-4783-ad8d-206a0b0372c8" containerName="copy" containerID="cri-o://8d1fe728d192a689b113e6ab30597764dabd8f0121d2a1200332da846701716a" gracePeriod=2 Mar 16 00:41:34 crc kubenswrapper[4816]: I0316 00:41:34.126031 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-75vc8/must-gather-jdjt6"] Mar 16 00:41:34 crc kubenswrapper[4816]: I0316 00:41:34.467128 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-75vc8_must-gather-jdjt6_09265502-9e41-4783-ad8d-206a0b0372c8/copy/0.log" Mar 16 00:41:34 crc kubenswrapper[4816]: I0316 00:41:34.467858 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-75vc8/must-gather-jdjt6" Mar 16 00:41:34 crc kubenswrapper[4816]: I0316 00:41:34.575476 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4x46w\" (UniqueName: \"kubernetes.io/projected/09265502-9e41-4783-ad8d-206a0b0372c8-kube-api-access-4x46w\") pod \"09265502-9e41-4783-ad8d-206a0b0372c8\" (UID: \"09265502-9e41-4783-ad8d-206a0b0372c8\") " Mar 16 00:41:34 crc kubenswrapper[4816]: I0316 00:41:34.575538 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/09265502-9e41-4783-ad8d-206a0b0372c8-must-gather-output\") pod \"09265502-9e41-4783-ad8d-206a0b0372c8\" (UID: \"09265502-9e41-4783-ad8d-206a0b0372c8\") " Mar 16 00:41:34 crc kubenswrapper[4816]: I0316 00:41:34.580853 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/09265502-9e41-4783-ad8d-206a0b0372c8-kube-api-access-4x46w" (OuterVolumeSpecName: "kube-api-access-4x46w") pod "09265502-9e41-4783-ad8d-206a0b0372c8" (UID: "09265502-9e41-4783-ad8d-206a0b0372c8"). InnerVolumeSpecName "kube-api-access-4x46w". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:41:34 crc kubenswrapper[4816]: I0316 00:41:34.638248 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/09265502-9e41-4783-ad8d-206a0b0372c8-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "09265502-9e41-4783-ad8d-206a0b0372c8" (UID: "09265502-9e41-4783-ad8d-206a0b0372c8"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:41:34 crc kubenswrapper[4816]: I0316 00:41:34.676948 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4x46w\" (UniqueName: \"kubernetes.io/projected/09265502-9e41-4783-ad8d-206a0b0372c8-kube-api-access-4x46w\") on node \"crc\" DevicePath \"\"" Mar 16 00:41:34 crc kubenswrapper[4816]: I0316 00:41:34.676988 4816 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/09265502-9e41-4783-ad8d-206a0b0372c8-must-gather-output\") on node \"crc\" DevicePath \"\"" Mar 16 00:41:34 crc kubenswrapper[4816]: I0316 00:41:34.912236 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-75vc8_must-gather-jdjt6_09265502-9e41-4783-ad8d-206a0b0372c8/copy/0.log" Mar 16 00:41:34 crc kubenswrapper[4816]: I0316 00:41:34.912995 4816 generic.go:334] "Generic (PLEG): container finished" podID="09265502-9e41-4783-ad8d-206a0b0372c8" containerID="8d1fe728d192a689b113e6ab30597764dabd8f0121d2a1200332da846701716a" exitCode=143 Mar 16 00:41:34 crc kubenswrapper[4816]: I0316 00:41:34.913047 4816 scope.go:117] "RemoveContainer" containerID="8d1fe728d192a689b113e6ab30597764dabd8f0121d2a1200332da846701716a" Mar 
16 00:41:34 crc kubenswrapper[4816]: I0316 00:41:34.913073 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-75vc8/must-gather-jdjt6" Mar 16 00:41:34 crc kubenswrapper[4816]: I0316 00:41:34.974361 4816 scope.go:117] "RemoveContainer" containerID="494b4782083857fe979ca56945f86f2f0ad0c22c9919d181363fe40238962eaa" Mar 16 00:41:35 crc kubenswrapper[4816]: I0316 00:41:35.009600 4816 scope.go:117] "RemoveContainer" containerID="8d1fe728d192a689b113e6ab30597764dabd8f0121d2a1200332da846701716a" Mar 16 00:41:35 crc kubenswrapper[4816]: E0316 00:41:35.010095 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d1fe728d192a689b113e6ab30597764dabd8f0121d2a1200332da846701716a\": container with ID starting with 8d1fe728d192a689b113e6ab30597764dabd8f0121d2a1200332da846701716a not found: ID does not exist" containerID="8d1fe728d192a689b113e6ab30597764dabd8f0121d2a1200332da846701716a" Mar 16 00:41:35 crc kubenswrapper[4816]: I0316 00:41:35.010173 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d1fe728d192a689b113e6ab30597764dabd8f0121d2a1200332da846701716a"} err="failed to get container status \"8d1fe728d192a689b113e6ab30597764dabd8f0121d2a1200332da846701716a\": rpc error: code = NotFound desc = could not find container \"8d1fe728d192a689b113e6ab30597764dabd8f0121d2a1200332da846701716a\": container with ID starting with 8d1fe728d192a689b113e6ab30597764dabd8f0121d2a1200332da846701716a not found: ID does not exist" Mar 16 00:41:35 crc kubenswrapper[4816]: I0316 00:41:35.010209 4816 scope.go:117] "RemoveContainer" containerID="494b4782083857fe979ca56945f86f2f0ad0c22c9919d181363fe40238962eaa" Mar 16 00:41:35 crc kubenswrapper[4816]: E0316 00:41:35.010628 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"494b4782083857fe979ca56945f86f2f0ad0c22c9919d181363fe40238962eaa\": container with ID starting with 494b4782083857fe979ca56945f86f2f0ad0c22c9919d181363fe40238962eaa not found: ID does not exist" containerID="494b4782083857fe979ca56945f86f2f0ad0c22c9919d181363fe40238962eaa" Mar 16 00:41:35 crc kubenswrapper[4816]: I0316 00:41:35.010659 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"494b4782083857fe979ca56945f86f2f0ad0c22c9919d181363fe40238962eaa"} err="failed to get container status \"494b4782083857fe979ca56945f86f2f0ad0c22c9919d181363fe40238962eaa\": rpc error: code = NotFound desc = could not find container \"494b4782083857fe979ca56945f86f2f0ad0c22c9919d181363fe40238962eaa\": container with ID starting with 494b4782083857fe979ca56945f86f2f0ad0c22c9919d181363fe40238962eaa not found: ID does not exist" Mar 16 00:41:35 crc kubenswrapper[4816]: I0316 00:41:35.675222 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09265502-9e41-4783-ad8d-206a0b0372c8" path="/var/lib/kubelet/pods/09265502-9e41-4783-ad8d-206a0b0372c8/volumes" Mar 16 00:42:00 crc kubenswrapper[4816]: I0316 00:42:00.166816 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29560362-87zl7"] Mar 16 00:42:00 crc kubenswrapper[4816]: E0316 00:42:00.167833 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f54343a-e5a5-4a0f-9940-5dabdbea5927" containerName="extract-utilities" Mar 16 00:42:00 crc kubenswrapper[4816]: I0316 00:42:00.167856 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f54343a-e5a5-4a0f-9940-5dabdbea5927" containerName="extract-utilities" Mar 16 00:42:00 crc kubenswrapper[4816]: E0316 00:42:00.167876 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09265502-9e41-4783-ad8d-206a0b0372c8" containerName="copy" Mar 16 00:42:00 crc kubenswrapper[4816]: I0316 00:42:00.167893 4816 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="09265502-9e41-4783-ad8d-206a0b0372c8" containerName="copy" Mar 16 00:42:00 crc kubenswrapper[4816]: E0316 00:42:00.167907 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f54343a-e5a5-4a0f-9940-5dabdbea5927" containerName="extract-content" Mar 16 00:42:00 crc kubenswrapper[4816]: I0316 00:42:00.167919 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f54343a-e5a5-4a0f-9940-5dabdbea5927" containerName="extract-content" Mar 16 00:42:00 crc kubenswrapper[4816]: E0316 00:42:00.167935 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d1b1f79-de52-4ade-9a72-69b86c55e8ff" containerName="registry-server" Mar 16 00:42:00 crc kubenswrapper[4816]: I0316 00:42:00.167947 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d1b1f79-de52-4ade-9a72-69b86c55e8ff" containerName="registry-server" Mar 16 00:42:00 crc kubenswrapper[4816]: E0316 00:42:00.167969 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d1b1f79-de52-4ade-9a72-69b86c55e8ff" containerName="extract-content" Mar 16 00:42:00 crc kubenswrapper[4816]: I0316 00:42:00.167981 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d1b1f79-de52-4ade-9a72-69b86c55e8ff" containerName="extract-content" Mar 16 00:42:00 crc kubenswrapper[4816]: E0316 00:42:00.167994 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f54343a-e5a5-4a0f-9940-5dabdbea5927" containerName="registry-server" Mar 16 00:42:00 crc kubenswrapper[4816]: I0316 00:42:00.168005 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f54343a-e5a5-4a0f-9940-5dabdbea5927" containerName="registry-server" Mar 16 00:42:00 crc kubenswrapper[4816]: E0316 00:42:00.168029 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d1b1f79-de52-4ade-9a72-69b86c55e8ff" containerName="extract-utilities" Mar 16 00:42:00 crc kubenswrapper[4816]: I0316 00:42:00.168041 4816 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="9d1b1f79-de52-4ade-9a72-69b86c55e8ff" containerName="extract-utilities" Mar 16 00:42:00 crc kubenswrapper[4816]: E0316 00:42:00.168064 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09265502-9e41-4783-ad8d-206a0b0372c8" containerName="gather" Mar 16 00:42:00 crc kubenswrapper[4816]: I0316 00:42:00.168077 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="09265502-9e41-4783-ad8d-206a0b0372c8" containerName="gather" Mar 16 00:42:00 crc kubenswrapper[4816]: I0316 00:42:00.168294 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d1b1f79-de52-4ade-9a72-69b86c55e8ff" containerName="registry-server" Mar 16 00:42:00 crc kubenswrapper[4816]: I0316 00:42:00.168316 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="09265502-9e41-4783-ad8d-206a0b0372c8" containerName="gather" Mar 16 00:42:00 crc kubenswrapper[4816]: I0316 00:42:00.168333 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f54343a-e5a5-4a0f-9940-5dabdbea5927" containerName="registry-server" Mar 16 00:42:00 crc kubenswrapper[4816]: I0316 00:42:00.168351 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="09265502-9e41-4783-ad8d-206a0b0372c8" containerName="copy" Mar 16 00:42:00 crc kubenswrapper[4816]: I0316 00:42:00.169082 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29560362-87zl7" Mar 16 00:42:00 crc kubenswrapper[4816]: I0316 00:42:00.172812 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 16 00:42:00 crc kubenswrapper[4816]: I0316 00:42:00.172917 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-8hc2r" Mar 16 00:42:00 crc kubenswrapper[4816]: I0316 00:42:00.172996 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 16 00:42:00 crc kubenswrapper[4816]: I0316 00:42:00.180430 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29560362-87zl7"] Mar 16 00:42:00 crc kubenswrapper[4816]: I0316 00:42:00.360774 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7tlfc\" (UniqueName: \"kubernetes.io/projected/31ff5b41-45f3-4978-aed9-4ea3cf4d6736-kube-api-access-7tlfc\") pod \"auto-csr-approver-29560362-87zl7\" (UID: \"31ff5b41-45f3-4978-aed9-4ea3cf4d6736\") " pod="openshift-infra/auto-csr-approver-29560362-87zl7" Mar 16 00:42:00 crc kubenswrapper[4816]: I0316 00:42:00.469065 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7tlfc\" (UniqueName: \"kubernetes.io/projected/31ff5b41-45f3-4978-aed9-4ea3cf4d6736-kube-api-access-7tlfc\") pod \"auto-csr-approver-29560362-87zl7\" (UID: \"31ff5b41-45f3-4978-aed9-4ea3cf4d6736\") " pod="openshift-infra/auto-csr-approver-29560362-87zl7" Mar 16 00:42:00 crc kubenswrapper[4816]: I0316 00:42:00.502676 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7tlfc\" (UniqueName: \"kubernetes.io/projected/31ff5b41-45f3-4978-aed9-4ea3cf4d6736-kube-api-access-7tlfc\") pod \"auto-csr-approver-29560362-87zl7\" (UID: \"31ff5b41-45f3-4978-aed9-4ea3cf4d6736\") " 
pod="openshift-infra/auto-csr-approver-29560362-87zl7" Mar 16 00:42:00 crc kubenswrapper[4816]: I0316 00:42:00.802343 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560362-87zl7" Mar 16 00:42:01 crc kubenswrapper[4816]: I0316 00:42:01.044167 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29560362-87zl7"] Mar 16 00:42:01 crc kubenswrapper[4816]: I0316 00:42:01.170115 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560362-87zl7" event={"ID":"31ff5b41-45f3-4978-aed9-4ea3cf4d6736","Type":"ContainerStarted","Data":"3b9aacb92328011ed3e7b23460f31c960e43bddb1bb6b0ed2e7b1fde4055ad1a"} Mar 16 00:42:03 crc kubenswrapper[4816]: I0316 00:42:03.195270 4816 generic.go:334] "Generic (PLEG): container finished" podID="31ff5b41-45f3-4978-aed9-4ea3cf4d6736" containerID="9c3a48820610a85f7b8e12ac2256cf65b198b7f676046e8539e7967f030e28c5" exitCode=0 Mar 16 00:42:03 crc kubenswrapper[4816]: I0316 00:42:03.195349 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560362-87zl7" event={"ID":"31ff5b41-45f3-4978-aed9-4ea3cf4d6736","Type":"ContainerDied","Data":"9c3a48820610a85f7b8e12ac2256cf65b198b7f676046e8539e7967f030e28c5"} Mar 16 00:42:04 crc kubenswrapper[4816]: I0316 00:42:04.482532 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29560362-87zl7" Mar 16 00:42:04 crc kubenswrapper[4816]: I0316 00:42:04.636115 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7tlfc\" (UniqueName: \"kubernetes.io/projected/31ff5b41-45f3-4978-aed9-4ea3cf4d6736-kube-api-access-7tlfc\") pod \"31ff5b41-45f3-4978-aed9-4ea3cf4d6736\" (UID: \"31ff5b41-45f3-4978-aed9-4ea3cf4d6736\") " Mar 16 00:42:04 crc kubenswrapper[4816]: I0316 00:42:04.645049 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31ff5b41-45f3-4978-aed9-4ea3cf4d6736-kube-api-access-7tlfc" (OuterVolumeSpecName: "kube-api-access-7tlfc") pod "31ff5b41-45f3-4978-aed9-4ea3cf4d6736" (UID: "31ff5b41-45f3-4978-aed9-4ea3cf4d6736"). InnerVolumeSpecName "kube-api-access-7tlfc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:42:04 crc kubenswrapper[4816]: I0316 00:42:04.738195 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7tlfc\" (UniqueName: \"kubernetes.io/projected/31ff5b41-45f3-4978-aed9-4ea3cf4d6736-kube-api-access-7tlfc\") on node \"crc\" DevicePath \"\"" Mar 16 00:42:05 crc kubenswrapper[4816]: I0316 00:42:05.216279 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560362-87zl7" event={"ID":"31ff5b41-45f3-4978-aed9-4ea3cf4d6736","Type":"ContainerDied","Data":"3b9aacb92328011ed3e7b23460f31c960e43bddb1bb6b0ed2e7b1fde4055ad1a"} Mar 16 00:42:05 crc kubenswrapper[4816]: I0316 00:42:05.216317 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3b9aacb92328011ed3e7b23460f31c960e43bddb1bb6b0ed2e7b1fde4055ad1a" Mar 16 00:42:05 crc kubenswrapper[4816]: I0316 00:42:05.216383 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29560362-87zl7" Mar 16 00:42:05 crc kubenswrapper[4816]: I0316 00:42:05.565128 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29560356-vl86r"] Mar 16 00:42:05 crc kubenswrapper[4816]: I0316 00:42:05.568658 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29560356-vl86r"] Mar 16 00:42:05 crc kubenswrapper[4816]: I0316 00:42:05.682002 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ffa2c59-99b2-4d5c-9c19-f0921ce688cb" path="/var/lib/kubelet/pods/0ffa2c59-99b2-4d5c-9c19-f0921ce688cb/volumes" Mar 16 00:42:05 crc kubenswrapper[4816]: I0316 00:42:05.868956 4816 scope.go:117] "RemoveContainer" containerID="a31a7a7d2a1fafc20c9ab619317cd87262faab560369319826f6c184261023c4"